4 Commits
v1.0.1 ... main

Author     SHA1        Message                                                                       Date
Hari Nair  9e00d48789  Merge pull request #6 from JHUAPL/5-update-to-v110 (merge 5 update to v110)   2025-07-30 12:13:27 -04:00
Hari Nair  72c19fb941  apply palantir formatting                                                     2025-07-30 12:07:15 -04:00
Hari Nair  6212144215  update dependencies                                                           2025-07-30 12:02:54 -04:00
Hari Nair  f826c9acb8  updates for v1.1.0                                                            2025-07-30 11:24:30 -04:00
226 changed files with 36729 additions and 36147 deletions


@@ -1,5 +1,20 @@
# Terrasaur Changelog

## July 30, 2025 - v1.1.0

- Updates to existing tools
  - AdjustShapeModelToOtherShapeModel
    - fix intersection bug
  - CreateSBMTStructure
    - new options: -flipX, -flipY, -spice, -date, -observer, -target, -cameraFrame
  - ValidateNormals
    - new option: -fast to only check for overhangs if center and normal point in opposite directions
- New tools:
  - FacetInfo: Print info about a facet
  - PointCloudOverlap: Find points in a point cloud which overlap a reference point cloud
  - TriAx: Generate a triaxial ellipsoid in ICQ format

## April 28, 2025 - v1.0.1

- Add MIT license to repository and source code
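The `-fast` heuristic added to ValidateNormals in v1.1.0 (flag a facet as a possible overhang only when its center and normal point in opposite directions) amounts to a sign test on a dot product. A minimal Python sketch of that idea (illustrative only, not Terrasaur code; the function name is this sketch's own):

```python
def opposite_directions(center, normal):
    """Return True when the facet-center vector and facet normal point in
    opposite directions, i.e. their dot product is negative. This is the
    cheap test the -fast option uses to decide which facets need the
    slower overhang check."""
    dot = sum(c * n for c, n in zip(center, normal))
    return dot < 0.0

# Facet on the +X side of a convex body, normal pointing outward: no overhang suspected.
print(opposite_directions((1.0, 0.0, 0.0), (0.9, 0.1, 0.0)))   # False
# Normal pointing back toward the body center: candidate overhang.
print(opposite_directions((1.0, 0.0, 0.0), (-0.8, 0.2, 0.0)))  # True
```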



@@ -16,7 +16,7 @@ The Terrasaur package requires Java 21 or later. Some freely available versions
Download
~~~~~~~~
Binary packages for use on Mac OS X and Linux are available at ...
Binary packages for use on Mac OS X and Linux are available at `GitHub <https://github.com/JHUAPL/Terrasaur/releases>`__.
We have not tried using the software on Microsoft Windows, but users may try the Linux package with the `Windows Subsystem for Linux <https://docs.microsoft.com/en-us/windows/wsl/>`__.


@@ -1,35 +1,35 @@
@ECHO OFF
pushd %~dp0
REM Command file for Sphinx documentation
if "%SPHINXBUILD%" == "" (
set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=.
set BUILDDIR=_build
if "%1" == "" goto help
%SPHINXBUILD% >NUL 2>NUL
if errorlevel 9009 (
echo.
echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
echo.installed, then set the SPHINXBUILD environment variable to point
echo.to the full path of the 'sphinx-build' executable. Alternatively you
echo.may add the Sphinx directory to PATH.
echo.
echo.If you don't have Sphinx installed, grab it from
echo.http://sphinx-doc.org/
exit /b 1
)
%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
goto end
:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
:end
popd

doc/tools/ColorSpots.rst (new file, 70 lines)

@@ -0,0 +1,70 @@
.. _ColorSpots:
##########
ColorSpots
##########
ColorSpots takes as input a shape model and a file containing (x, y, z, value)
or (lat, lon, value) records. For each facet it writes out the mean and standard
deviation of the values within a specified search radius.
.. include:: ../toolDescriptions/ColorSpots.txt
:literal:
********
Examples
********
Download the :download:`Apophis<./support_files/apophis_g_15618mm_rdr_obj_0000n00000_v001.obj>`
shape model and the :download:`info<./support_files/xyzrandom.txt>` file containing
Cartesian coordinates and a random value.
Run ColorSpots:
::
ColorSpots -obj apophis_g_15618mm_rdr_obj_0000n00000_v001.obj -xyz \
-info xyzrandom.txt -outFile apophis_value_at_vertex.csv -noWeight \
-allFacets -additionalFields n -searchRadius 0.015 -writeVertices
The first few lines of apophis_value_at_vertex.csv look like:
::
% head apophis_value_at_vertex.csv
0.000000e+00, 0.000000e+00, 1.664960e-01, -3.805764e-02, 5.342315e-01, 4.000000e+01
1.589500e-02, 0.000000e+00, 1.591030e-01, 6.122849e-02, 6.017192e-01, 5.000000e+01
7.837000e-03, 1.486800e-02, 1.591670e-01, -6.072964e-03, 5.220682e-01, 5.700000e+01
-7.747000e-03, 1.506300e-02, 1.621040e-01, 9.146163e-02, 5.488631e-01, 4.900000e+01
-1.554900e-02, 0.000000e+00, 1.657970e-01, -8.172811e-03, 5.270302e-01, 3.400000e+01
-7.982000e-03, -1.571100e-02, 1.694510e-01, -2.840524e-02, 5.045911e-01, 3.900000e+01
8.060000e-03, -1.543300e-02, 1.655150e-01, 3.531959e-02, 5.464390e-01, 4.900000e+01
3.179500e-02, 0.000000e+00, 1.515820e-01, -1.472434e-02, 5.967265e-01, 5.400000e+01
2.719700e-02, 1.658200e-02, 1.508930e-01, -9.050683e-03, 5.186966e-01, 4.700000e+01
1.554100e-02, 2.901300e-02, 1.530770e-01, -7.053547e-02, 4.980369e-01, 7.000000e+01
The columns are:
.. list-table:: ColorSpots Vertex output
:header-rows: 1
* - Column
- Value
* - 1
- X
* - 2
- Y
* - 3
- Z
* - 4
- mean value in region
* - 5
- standard deviation in region
* - 6
- number of points in region
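A quick way to sanity-check the output is to parse a few rows with only the standard library. The field names below are this sketch's own labels, taken from the column table above; the two sample rows are copied from the output shown earlier:

```python
import csv
import io

# First two rows of apophis_value_at_vertex.csv as shown above.
sample = io.StringIO(
    "0.000000e+00, 0.000000e+00, 1.664960e-01, -3.805764e-02, 5.342315e-01, 4.000000e+01\n"
    "1.589500e-02, 0.000000e+00, 1.591030e-01, 6.122849e-02, 6.017192e-01, 5.000000e+01\n"
)
# Column meanings from the table: x, y, z, mean, standard deviation, point count.
fields = ("x", "y", "z", "mean", "stddev", "n_points")
rows = [dict(zip(fields, map(float, row))) for row in csv.reader(sample, skipinitialspace=True)]
print(rows[0]["n_points"])  # 40.0 points fell within the 0.015 search radius of the first vertex
```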
.. figure:: images/ColorSpots-n.png
:alt: Number of points in region at each vertex
This image shows the number of points in the region at each vertex.


@@ -23,7 +23,7 @@ Local Model Comparison
Download the :download:`reference<./support_files/EVAL20_wtr.obj>` and :download:`comparison<./support_files/EVAL20.obj>`
shape models. You can view them in a tool such as
`ParaView<https://www.paraview.org/>`.
`ParaView <https://www.paraview.org/>`__.
.. figure:: images/CompareOBJ_local_1.png
@@ -32,8 +32,8 @@ shape models. You can view them in a tool such as
Run CompareOBJ to find the optimal transform to align the comparison with the reference:
::
CompareOBJ -computeOptimalRotationAndTranslation -model F3H-1/EVAL20.obj \
-reference F3H-1/EVAL20_wtr.obj -computeVerticalError verticalError.txt \
CompareOBJ -computeOptimalRotationAndTranslation -model EVAL20.obj \
-reference EVAL20_wtr.obj -computeVerticalError verticalError.txt \
-saveOptimalShape optimal.obj -savePlateDiff plateDiff.txt -savePlateIndex plateIndex.txt
The screen output is
@@ -77,7 +77,7 @@ model for comparison:
::
ShapeFormatConverter -input Bennu/Bennu49k.obj -output BennuComparison.obj \
ShapeFormatConverter -input Bennu49k.obj -output BennuComparison.obj \
-rotate 5,0,0,1 -translate 0.01,-0.01,0.01
This rotates the shape model by 5 degrees about the z axis and then translates
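The rotate-then-translate operation applied here can be sketched in Python. This is illustrative only; the exact rotation sense and axis conventions of ShapeFormatConverter are assumptions, and the function name is this sketch's own:

```python
import math

def rotate_z_then_translate(v, angle_deg, t):
    """Rotate a vertex about the z axis by angle_deg degrees, then add the
    translation t, mimicking `-rotate 5,0,0,1 -translate 0.01,-0.01,0.01`.
    (Sketch only; the tool's rotation sign convention is an assumption.)"""
    a = math.radians(angle_deg)
    x, y, z = v
    xr = x * math.cos(a) - y * math.sin(a)
    yr = x * math.sin(a) + y * math.cos(a)
    return (xr + t[0], yr + t[1], z + t[2])

# One vertex of a hypothetical shape model, transformed as in the command above.
moved = rotate_z_then_translate((0.25, 0.0, 0.1), 5.0, (0.01, -0.01, 0.01))
```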
@@ -94,11 +94,11 @@ Run CompareOBJ to find the optimal transform to align the comparison with the re
CompareOBJ -computeOptimalRotationAndTranslation \
-model BennuComparison.obj \
-reference Bennu/Bennu49k.obj \
-computeVerticalError CompareOBJ/terrasaur-verticalError.txt \
-saveOptimalShape CompareOBJ/terrasaur-optimal.obj \
-savePlateDiff CompareOBJ/terrasaur-plateDiff.txt \
-savePlateIndex CompareOBJ/terrasaur-plateIndex.txt
-reference Bennu49k.obj \
-computeVerticalError terrasaur-verticalError.txt \
-saveOptimalShape terrasaur-optimal.obj \
-savePlateDiff terrasaur-plateDiff.txt \
-savePlateIndex terrasaur-plateIndex.txt
The screen output is


@@ -0,0 +1,38 @@
.. _PointCloudOverlap:
#################
PointCloudOverlap
#################
*****
Usage
*****
.. include:: ../toolDescriptions/PointCloudOverlap.txt
:literal:
********
Examples
********
Download the :download:`reference<./support_files/EVAL20_wtr.obj>` and :download:`comparison<./support_files/EVAL20.obj>`
shape models. You can view them in a tool such as
`ParaView <https://www.paraview.org/>`__.
.. figure:: images/PointCloudOverlap_1.png
This image shows the reference (pink) and input (blue) shape models.
Run PointCloudOverlap:
::
PointCloudOverlap -inputFile EVAL20.obj -referenceFile EVAL20_wtr.obj -outputFile overlap.vtk
Note that OBJ is supported as an input format but not as an output format.
.. figure:: images/PointCloudOverlap_2.png
The points in white are those in the input model that overlap the reference.

doc/tools/TriAx.rst (new file, 43 lines)

@@ -0,0 +1,43 @@
.. _TriAx:
=====
TriAx
=====
TriAx is an implementation of the SPC tool TRIAX, which generates a triaxial ellipsoid in ICQ format.
*****
Usage
*****
.. include:: ../toolDescriptions/TriAx.txt
:literal:
*******
Example
*******
Generate an ellipsoid with semi-axes 10, 8, and 6, using ICQ resolution q = 8.
::
TriAx -A 10 -B 8 -C 6 -Q 8 -output triax.icq -saveOBJ
The following ellipsoid is generated:
.. container:: figures-row
.. figure:: images/TriAx_X.png
:alt: looking down from the +X direction
looking down from the +X direction
.. figure:: images/TriAx_Y.png
:alt: looking down from the +Y direction
looking down from the +Y direction
.. figure:: images/TriAx_Z.png
:alt: looking down from the +Z direction
looking down from the +Z direction
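Every vertex of such a triaxial ellipsoid satisfies x²/A² + y²/B² + z²/C² = 1. A small Python sketch of that invariant, using the semi-axes from the example above (the lat/lon parameterization here is purely illustrative; the actual ICQ format lays vertices out on six cube faces, not a lat/lon grid):

```python
import math

A, B, C = 10.0, 8.0, 6.0  # the -A/-B/-C semi-axes from the example above

def ellipsoid_point(lat_deg, lon_deg):
    """A point on the triaxial ellipsoid x^2/A^2 + y^2/B^2 + z^2/C^2 = 1,
    parameterized by latitude/longitude-style angles (sketch only)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    return (A * math.cos(lat) * math.cos(lon),
            B * math.cos(lat) * math.sin(lon),
            C * math.sin(lat))

x, y, z = ellipsoid_point(30.0, 45.0)
residual = x * x / A**2 + y * y / B**2 + z * z / C**2 - 1.0
print(abs(residual) < 1e-12)  # True: the point lies on the ellipsoid surface
```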

Five binary image files added (content not shown); sizes 174 KiB, 139 KiB, 23 KiB, 24 KiB, and 29 KiB.


@@ -89,7 +89,7 @@ function build_jar() {
function make_scripts() {
classes=$(jar tf "${scriptPath}"/target/${packageName}.jar | grep $appSrcDir | grep -v '\$' | grep -v "package-info" | grep class)
classes=$(jar tf ${scriptPath}/target/${packageName}.jar | grep $appSrcDir | grep -v '\$' | grep -v "package-info" | grep -v "Immutable" | grep class)
for class in $classes; do
base=$(basename "$class" ".class")

pom.xml (88 changed lines)

@@ -1,5 +1,6 @@
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>edu.jhuapl.ses.srn</groupId>
<artifactId>Terrasaur</artifactId>
@@ -42,7 +43,7 @@
<maven.compiler.release>21</maven.compiler.release>
<javafx.version>21.0.5</javafx.version>
<immutables.version>2.10.1</immutables.version>
<immutables.version>2.11.1</immutables.version>
<jackfruit.version>1.1.1</jackfruit.version>
</properties>
@@ -55,25 +56,29 @@
<groupId>com.mycila</groupId>
<artifactId>license-maven-plugin</artifactId>
<version>5.0.0</version>
<configuration>
<licenseSets>
<licenseSet>
<header>com/mycila/maven/plugin/license/templates/MIT.txt</header>
<includes>
<include>**/*.java</include>
</includes>
</licenseSet>
</licenseSets>
</configuration>
<executions>
<execution>
<id>check-license</id>
<phase>verify</phase>
<goals>
<goal>check</goal>
</goals>
</execution>
</executions>
<configuration>
<licenseSets>
<licenseSet>
<header>com/mycila/maven/plugin/license/templates/MIT.txt</header>
<includes>
<include>**/*.java</include>
</includes>
<excludes>
<exclude>3rd-party/**/*.java</exclude>
<exclude>support-libraries/3rd-party/**/*.java</exclude>
</excludes>
</licenseSet>
</licenseSets>
</configuration>
<executions>
<execution>
<id>check-license</id>
<phase>verify</phase>
<goals>
<goal>check</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
@@ -126,14 +131,15 @@
<artifactId>maven-surefire-plugin</artifactId>
<version>3.5.3</version>
<configuration>
<argLine> -Djava.library.path=${project.basedir}/3rd-party/${env.ARCH}/spice/JNISpice/lib
<argLine>
-Djava.library.path=${project.basedir}/3rd-party/${env.ARCH}/spice/JNISpice/lib
</argLine>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-enforcer-plugin</artifactId>
<version>3.5.0</version>
<version>3.6.1</version>
<executions>
<execution>
<id>enforce-maven</id>
@@ -207,7 +213,7 @@
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>3.5.0</version>
<version>3.5.1</version>
<executions>
<execution>
<phase>generate-sources</phase>
@@ -222,6 +228,18 @@
</execution>
</executions>
</plugin>
<plugin>
<groupId>com.diffplug.spotless</groupId>
<artifactId>spotless-maven-plugin</artifactId>
<version>2.46.1</version>
<configuration>
<java>
<palantirJavaFormat />
</java>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
@@ -264,7 +282,7 @@
<dependency>
<groupId>commons-beanutils</groupId>
<artifactId>commons-beanutils</artifactId>
<version>1.10.0</version>
<version>1.11.0</version>
</dependency>
<dependency>
<groupId>commons-cli</groupId>
@@ -274,17 +292,22 @@
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-configuration2</artifactId>
<version>2.11.0</version>
<version>2.12.0</version>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-csv</artifactId>
<version>1.13.0</version>
<version>1.14.1</version>
</dependency>
<dependency>
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
<version>2.18.0</version>
<version>2.20.0</version>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-text</artifactId>
<version>1.14.0</version>
</dependency>
<dependency>
<groupId>com.beust</groupId>
@@ -294,7 +317,7 @@
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.12.1</version>
<version>2.13.1</version>
</dependency>
<dependency>
@@ -315,18 +338,13 @@
<dependency>
<groupId>gov.nasa.gsfc.heasarc</groupId>
<artifactId>nom-tam-fits</artifactId>
<version>1.20.2</version>
<version>1.21.1</version>
</dependency>
<dependency>
<groupId>gov.nasa.jpl.naif</groupId>
<artifactId>spice</artifactId>
<version>N0067</version>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-text</artifactId>
<version>1.13.0</version>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-api</artifactId>


@@ -29,319 +29,314 @@ import terrasaur.utils.StringUtil;
public class ALTWGProductNamer implements ProductNamer {

    public ALTWGProductNamer() {
        super();
    }

    /**
     * Parse the productName and return the portion of the name corresponding to a given field.
     * Fields are assumed separated by "_" in the filename.
     *
     * @param productName product file name to parse
     * @param fieldNum zero-based index of the desired field
     * @return the requested field, or "ERROR" if fieldNum is out of range
     */
    @Override
    public String getNameFrag(String productName, int fieldNum) {
        String[] fields = productName.split("_");
        String returnField = "ERROR";
        // use >= : fields[fields.length] would throw ArrayIndexOutOfBoundsException
        if (fieldNum >= fields.length) {
            System.out.println("ERROR, field:" + fieldNum + " requested is beyond the number of fields found.");
            System.out.println("returning:" + returnField);
        } else {
            returnField = fields[fieldNum];
        }
        return returnField;
    }

    @Override
    public String productbaseName(FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal) {

        String gsd = "gsd";
        String dataSrc = "dataSrc";
        String productType = altwgProduct.getFileFrag();
        String prodVer = getVersion(hdrBuilder);

        // extract ground sample distance. gsdD is in mm!
        double gsdD = gsdFromHdr(hdrBuilder);
        int gsdI = (int) Math.round(gsdD);
        String fileUnits = "mm";
        gsd = String.format("%05d", gsdI) + fileUnits;

        HeaderTag key = HeaderTag.DATASRC;
        if (hdrBuilder.containsKey(key)) {
            dataSrc = hdrBuilder.getCard(key).getValue().toLowerCase();
            // check whether dataSrc needs to be modified
            dataSrc = HeaderTag.getSDP(dataSrc);
            // data source should only be 3 chars long
            if (dataSrc.length() > 3) {
                dataSrc = dataSrc.substring(0, 3);
            }
        }

        key = HeaderTag.CLON;
        String cLon = null;
        if (hdrBuilder.containsKey(key)) {
            cLon = hdrBuilder.getCard(key).getValue();
        }
        if (cLon == null) {
            if (isGlobal) {
                // set center longitude to 0.0 if value not parsed and this is a global product
                cLon = "0.0";
            } else {
                String errMesg = "ERROR! Could not parse CLON from fits header!";
                throw new RuntimeException(errMesg);
            }
        }

        key = HeaderTag.CLAT;
        String cLat = null;
        if (hdrBuilder.containsKey(key)) {
            cLat = hdrBuilder.getCard(key).getValue();
        }
        if (cLat == null) {
            if (isGlobal) {
                // set center latitude to 0.0 if value not parsed and this is a global product
                cLat = "0.0";
            } else {
                String errMesg = "ERROR! Could not parse CLAT from fits header!";
                throw new RuntimeException(errMesg);
            }
        }

        String region = "l";
        if (isGlobal) {
            region = "g";
        }

        String clahLon = ALTWGProductNamer.clahLon(cLat, cLon);

        // pds likes having filenames all in the same case, so chose lowercase
        String outFile = ALTWGProductNamer.altwgBaseName(region, gsd, dataSrc, productType, clahLon, prodVer);
        return outFile;
    }

    /**
     * Retrieve the product version string. Returns initial default value if product version keyword
     * not found in builder.
     *
     * @param hdrBuilder FITS header builder to query
     */
    @Override
    public String getVersion(FitsHdrBuilder hdrBuilder) {
        String prodVer = "prodVer";
        // note: this has been changed to MAP_VER in the SIS
        HeaderTag key = HeaderTag.MAP_VER;
        if (hdrBuilder.containsKey(key)) {
            prodVer = hdrBuilder.getCard(key).getValue();
            prodVer = prodVer.replaceAll("\\.", "");
        }
        return prodVer;
    }

    // Given the fields return the altwg PDS base name
    public static String altwgBaseName(
            String region, String gsd, String dataSource, String desc, String lahLon, String version) {
        StringBuilder sb = new StringBuilder();
        String delim = "_";
        sb.append(region);
        sb.append(delim);
        sb.append(gsd);
        sb.append(delim);
        // data source should only be 3 characters long
        if (dataSource.length() > 3) {
            System.out.println("WARNING! dataSource:" + dataSource + " longer than 3 chars!");
            dataSource = dataSource.substring(0, 3);
            System.out.println("Will set data source to:"
                    + dataSource
                    + " but this might NOT conform to the ALTWG naming convention!");
        }
        sb.append(dataSource);
        sb.append(delim);
        sb.append(desc);
        sb.append(delim);
        sb.append(lahLon);
        sb.append(delim);
        sb.append("v");
        // remove '.' from version string
        version = version.replaceAll("\\.", "");
        sb.append(version);
        // pds likes having filenames all in the same case, so chose lowercase
        return sb.toString().toLowerCase();
    }

    /**
     * Parse center lat, lon strings and return the formatted clahLon portion of the PDS filename.
     *
     * @param clat center latitude string
     * @param clon center longitude string
     * @return formatted clahLon filename fragment
     */
    public static String clahLon(String clat, String clon) {
        // remove all whitespace that may exist in the strings
        clat = clat.replaceAll("\\s+", "");
        clon = clon.replaceAll("\\s+", "");
        double cLonD = StringUtil.parseSafeD(clon);
        double cLatD = StringUtil.parseSafeD(clat);
        return clahLon(cLatD, cLonD);
    }

    public static String clahLon(double cLatD, double cLonD) {
        String cLon = "";
        // Double.isNaN() is required here; (x == Double.NaN) is always false in Java
        if (Double.isNaN(cLonD)) {
            // unable to parse center longitude using normal method (see V in getProductCards())
            cLon = "xxxxxx";
        } else {
            if (cLonD < 0) {
                // transform to 0-360
                cLonD = cLonD + 360D;
            }
            // format double to 2 decimal places
            cLon = String.format("%06.2f", cLonD);
        }
        // remove decimal point
        cLon = cLon.replace(".", "");

        String cLat = "";
        if (Double.isNaN(cLatD)) {
            // unable to parse center latitude
            cLat = "xxxxxx";
        } else {
            double tol = 0.0101D;
            // determine whether latitude is within tolerance of its rounded value.
            // if so then use rounded value
            double roundValue = Math.round(cLatD);
            double diffTol = Math.abs(roundValue - cLatD);
            if (diffTol < tol) {
                cLatD = roundValue;
            }
            String hemiSphere = (cLatD >= 0) ? "N" : "S";
            if (cLatD < 0) {
                // remove negative sign if in southern hemisphere
                cLatD = cLatD * -1.0D;
            }
            // format cLat to 2 decimal places
            cLat = String.format("%05.2f", cLatD);
            cLat = cLat.replace(".", "");
            // trim to length 4.
            cLat = cLat.substring(0, Math.min(cLat.length(), 4));
            cLat = cLat + hemiSphere;
        }
        return cLat + cLon;
    }

    /**
     * Return GSD parsed from FitsHdrBuilder. Returns -999 if a valid GSD could not be parsed. GSD
     * is in mm.
     *
     * @param hdrBuilder FITS header builder to query
     * @return ground sample distance in mm
     */
    @Override
    public double gsdFromHdr(FitsHdrBuilder hdrBuilder) {
        String gsd = "gsd";
        double gsdD = Double.NaN;
        // extract ground sample distance using GSD first
        HeaderTag key = HeaderTag.GSD;
        if (hdrBuilder.containsKey(key)) {
            gsd = hdrBuilder.getCard(key).getValue();
            gsdD = StringUtil.parseSafeD(gsd);
            if (gsdD < 0D) {
                // keyword value not initialized
                gsdD = Double.NaN;
                System.out.println("WARNING! keyword GSD not set!");
            }
        } else {
            System.out.println("could not find " + key.toString() + " to parse GSD from.");
        }
        if (Double.isNaN(gsdD)) {
            // could not parse GSD into valid number, try GSDI
            key = HeaderTag.GSDI;
            if (hdrBuilder.containsKey(key)) {
                gsdD = StringUtil.parseSafeD(hdrBuilder.getCard(key).getValue());
                if (gsdD < 0D) {
                    // keyword value not initialized
                    gsdD = Double.NaN;
                    System.out.println("WARNING! keyword GSDI not set!");
                }
            } else {
                System.out.println("could not find " + key.toString() + " to parse GSD from.");
            }
            if (Double.isNaN(gsdD)) {
                // still cannot parse gsd. Set to -999
                System.out.println("WARNING: No valid values of GSD or GSDI could be parsed from fits header!");
                System.out.println("Setting gsd = -999");
                gsdD = -999D;
            }
        }
        if (hdrBuilder.getCard(key).getComment().contains("[cm]")) {
            // mandated to use mm! change the units
            gsdD = gsdD * 10.0D;
        }
        return gsdD;
    }

    @Override
    public NameConvention getNameConvention() {
        return NameConvention.ALTPRODUCT;
    }

    /** Parse the filename using the ALTWGProduct naming convention and return the GSD value. */
    @Override
    public double gsdFromFilename(String filename) {
        String[] splitStr = filename.split("_");
        // GSD is second element
        String gsd = splitStr[1];
        gsd = gsd.replace("mm", "");
        return StringUtil.parseSafeD(gsd);
    }
}
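The string manipulations in `clahLon(double, double)` above can be mirrored in a few lines of Python. This sketch reproduces the same formatting for illustration only (the function name is this sketch's own):

```python
def clah_lon(clat_deg, clon_deg):
    """Mirror of ALTWGProductNamer.clahLon: build the latitude+hemisphere and
    0-360 longitude fragment used in ALTWG PDS file names."""
    # longitude: wrap negatives into 0-360, format to 2 decimals, drop the '.'
    if clon_deg < 0:
        clon_deg += 360.0
    lon = f"{clon_deg:06.2f}".replace(".", "")
    # latitude: snap to the rounded value when within 0.0101 degrees of it
    if abs(round(clat_deg) - clat_deg) < 0.0101:
        clat_deg = float(round(clat_deg))
    hemi = "N" if clat_deg >= 0 else "S"
    lat = f"{abs(clat_deg):05.2f}".replace(".", "")[:4] + hemi
    return lat + lon

print(clah_lon(45.0, -90.0))  # 4500N27000
```

Note that after the lowercasing applied elsewhere in the namer, a center of (0, 0) yields `0000n00000`, the same fragment that appears in the Apophis shape-model filename earlier in this document.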


@@ -29,184 +29,182 @@ import terrasaur.utils.StringUtil;
public class AltwgMLNNamer implements ProductNamer {
    public AltwgMLNNamer() {
        super();
    }

    @Override
    public String getNameFrag(String productName, int fieldNum) {
        String nameFrag = "";
        return nameFrag;
    }

    /**
     * ALTWG MLN naming convention applies only to one productType - the ALTWG NFT-MLN.
     *
     * @param hdrBuilder - contains values that are used to create the MLN according to naming
     *     convention.
     * @param altwgProduct - N/A. Included here as part of the interface structure.
     * @param isGlobal - N/A. Included here as part of the interface structure.
     */
    @Override
    public String productbaseName(FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal) {

        // initialize string fragments for NFT name. This will help identify
        // which string fragments have not been updated by the method.
        String gsd = "gsd";
        String dataSrc = "dataSrc";
        String dataSrcfile = "dataSrcFile";
        String productType = "nftdtm";
        String cLon = "cLon";
        String cLat = "cLat";
        String prodVer = "prodVer";
        String productID = "prodID";

        // find relevant information in the hdrBuilder map.
        double gsdD = gsdFromHdr(hdrBuilder);
        int gsdI = (int) Math.round(gsdD);
        String fileUnits = "mm";
        gsd = String.format("%05d", gsdI) + fileUnits;

        // HeaderTag key = HeaderTag.GSD;
        // if (hdrBuilder.containsKey(key)) {
        //   gsd = hdrBuilder.getCard(key).getValue();
        //
        //   double gsdD = Double.parseDouble(gsd);
        //   String fileUnits = "";
        //   if (hdrBuilder.getCard(key).getComment().contains("[mm]")) {
        //     fileUnits = "mm";
        //   } else if (hdrBuilder.getCard(key).getComment().contains("[cm]")) {
        //
        //     // mandated to use mm! change the units
        //     gsdD = gsdD * 10.0D;
        //     gsdI = (int) Math.round(gsdD);
        //   }
        //
        //   System.out.println("gsd:" + gsd);
        //   System.out.println("file units:" + fileUnits);
        //
        // }

        HeaderTag key = HeaderTag.DATASRC;
        if (hdrBuilder.containsKey(key)) {
            dataSrc = hdrBuilder.getCard(key).getValue().toLowerCase();
            // data source should only be 3 chars long
            if (dataSrc.length() > 3) {
                dataSrc = dataSrc.substring(0, 3);
            }
        }

        key = HeaderTag.CLON;
        if (hdrBuilder.containsKey(key)) {
            cLon = hdrBuilder.getCard(key).getValue();
        }

        key = HeaderTag.CLAT;
        if (hdrBuilder.containsKey(key)) {
            cLat = hdrBuilder.getCard(key).getValue();
        }

        key = HeaderTag.DATASRCF;
        if (hdrBuilder.containsKey(key)) {
            dataSrcfile = hdrBuilder.getCard(key).getValue();
        }

        key = HeaderTag.PRODVERS;
        if (hdrBuilder.containsKey(key)) {
            prodVer = hdrBuilder.getCard(key).getValue();
            prodVer = prodVer.replaceAll("\\.", "");
        }

        // hardcode region to local
        String region = "l";

        StringBuilder sb = new StringBuilder();
        String delim = "_";
        sb.append(region);
        sb.append(delim);
        sb.append(gsd);
        sb.append(delim);
        sb.append(dataSrc);
        sb.append(delim);
        sb.append(productType);
        sb.append(delim);

        /*
         * determine product ID. For OLA it is the center lat,lon. For SPC it is the NFT feature id,
         * which is assumed to be the first 5 chars of DATASRCF.
         */
        if (dataSrc.contains("ola")) {
            // follow ALTWG product naming convention for center lat, lon
            productID = ALTWGProductNamer.clahLon(cLat, cLon);
        } else {
            productID = dataSrcfile.substring(0, 5);
        }
        sb.append(productID);
        sb.append(delim);
        sb.append("v");
        sb.append(prodVer);

        return sb.toString().toLowerCase();
    }
// fileUnits = "mm";
// } else if (hdrBuilder.getCard(key).getComment().contains("[cm]")) {
//
// // mandated to use mm! change the units
// gsdD = gsdD * 10.0D;
// gsdI = (int) Math.round(gsdD);
// }
//
// System.out.println("gsd:" + gsd);
// System.out.println("file units:" + fileUnits);
//
// }
@Override
public String getVersion(FitsHdrBuilder hdrBuilder) {
HeaderTag key = HeaderTag.DATASRC;
if (hdrBuilder.containsKey(key)) {
dataSrc = hdrBuilder.getCard(key).getValue().toLowerCase();
// data source should only be 3 chars long
if (dataSrc.length() > 3) {
dataSrc = dataSrc.substring(0, 3);
}
}
String version = "";
key = HeaderTag.CLON;
if (hdrBuilder.containsKey(key)) {
cLon = hdrBuilder.getCard(key).getValue();
}
return version;
}
key = HeaderTag.CLAT;
if (hdrBuilder.containsKey(key)) {
cLat = hdrBuilder.getCard(key).getValue();
}
/**
* Extract ground sample distance from FitsHdrBuilder. GSD is needed as part of naming convention.
* GSD in units of mm.
*/
@Override
public double gsdFromHdr(FitsHdrBuilder hdrBuilder) {
key = HeaderTag.DATASRCF;
if (hdrBuilder.containsKey(key)) {
dataSrcfile = hdrBuilder.getCard(key).getValue();
}
// find relevant information in the hdrBuilder map.
double gsdD = Double.NaN;
HeaderTag key = HeaderTag.GSD;
if (hdrBuilder.containsKey(key)) {
String gsd = hdrBuilder.getCard(key).getValue();
key = HeaderTag.PRODVERS;
if (hdrBuilder.containsKey(key)) {
prodVer = hdrBuilder.getCard(key).getValue();
prodVer = prodVer.replaceAll("\\.", "");
}
gsdD = Double.parseDouble(gsd);
String fileUnits = "";
if (hdrBuilder.getCard(key).getComment().contains("[mm]")) {
fileUnits = "mm";
} else if (hdrBuilder.getCard(key).getComment().contains("[cm]")) {
// hardcode region to local
String region = "l";
// mandated to use mm! change the units
gsdD = gsdD * 10.0D;
fileUnits = "mm";
}
System.out.println("gsd:" + gsd);
System.out.println("file units:" + fileUnits);
StringBuilder sb = new StringBuilder();
String delim = "_";
sb.append(region);
sb.append(delim);
sb.append(gsd);
sb.append(delim);
sb.append(dataSrc);
sb.append(delim);
sb.append(productType);
sb.append(delim);
} else {
String errMesg =
"ERROR! Could not find keyword:" + HeaderTag.GSD.toString() + " in hdrBuilder";
throw new RuntimeException(errMesg);
/*
* determine product ID. For OLA it is the center lat,lon For SPC it is the NFT feature id,
* which is assumed to be the first 5 chars in DATASRC
*/
if (dataSrc.contains("ola")) {
// follow ALTWG product naming convention for center lat, lon
productID = ALTWGProductNamer.clahLon(cLat, cLon);
} else {
productID = dataSrcfile.substring(0, 5);
}
sb.append(productID);
sb.append(delim);
sb.append("v");
sb.append(prodVer);
return sb.toString().toLowerCase();
}
return gsdD;
}
@Override
public String getVersion(FitsHdrBuilder hdrBuilder) {
@Override
public NameConvention getNameConvention() {
return NameConvention.ALTNFTMLN;
}
String version = "";
@Override
/** Parse the filename using the ALTWG MLN naming convention and return the GSD value. */
public double gsdFromFilename(String filename) {
String[] splitStr = filename.split("_");
// GSD is second element
String gsd = splitStr[1];
gsd = gsd.replace("mm", "");
return StringUtil.parseSafeD(gsd);
}
return version;
}
/**
* Extract ground sample distance from FitsHdrBuilder. GSD is needed as part of naming convention.
* GSD in units of mm.
*/
@Override
public double gsdFromHdr(FitsHdrBuilder hdrBuilder) {
// find relevant information in the hdrBuilder map.
double gsdD = Double.NaN;
HeaderTag key = HeaderTag.GSD;
if (hdrBuilder.containsKey(key)) {
String gsd = hdrBuilder.getCard(key).getValue();
gsdD = Double.parseDouble(gsd);
String fileUnits = "";
if (hdrBuilder.getCard(key).getComment().contains("[mm]")) {
fileUnits = "mm";
} else if (hdrBuilder.getCard(key).getComment().contains("[cm]")) {
// mandated to use mm! change the units
gsdD = gsdD * 10.0D;
fileUnits = "mm";
}
System.out.println("gsd:" + gsd);
System.out.println("file units:" + fileUnits);
} else {
String errMesg = "ERROR! Could not find keyword:" + HeaderTag.GSD.toString() + " in hdrBuilder";
throw new RuntimeException(errMesg);
}
return gsdD;
}
@Override
public NameConvention getNameConvention() {
return NameConvention.ALTNFTMLN;
}
@Override
/** Parse the filename using the ALTWG MLN naming convention and return the GSD value. */
public double gsdFromFilename(String filename) {
String[] splitStr = filename.split("_");
// GSD is second element
String gsd = splitStr[1];
gsd = gsd.replace("mm", "");
return StringUtil.parseSafeD(gsd);
}
}
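The assembly above yields lowercase basenames of the form region_gsd_datasrc_nftdtm_productid_vNN. A minimal standalone sketch of the same string-building, with hypothetical header values standing in for the FitsHdrBuilder lookups (the OLA branch, which needs ALTWGProductNamer.clahLon, is omitted):

```java
public class MlnNameSketch {

    // Mirrors the StringBuilder assembly in AltwgMLNNamer.productbaseName().
    // All input values here are hypothetical; the real code reads them from a FITS header.
    static String basename(int gsdMm, String dataSrc, String dataSrcFile, String prodVer) {
        String gsd = String.format("%05d", gsdMm) + "mm"; // zero-padded GSD in mm
        if (dataSrc.length() > 3) {
            dataSrc = dataSrc.substring(0, 3); // data source is capped at 3 chars
        }
        String productID = dataSrcFile.substring(0, 5); // non-OLA branch: first 5 chars of DATASRCF
        String name = String.join(
                "_", "l", gsd, dataSrc, "nftdtm", productID, "v" + prodVer.replaceAll("\\.", ""));
        return name.toLowerCase(); // PDS prefers a single case
    }

    public static void main(String[] args) {
        System.out.println(basename(5, "spcam", "FEAT1_src.fits", "1.0"));
        // prints l_00005mm_spc_nftdtm_feat1_v10
    }
}
```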

View File

@@ -36,291 +36,287 @@ import terrasaur.utils.StringUtil;
*/
public class DartNamer implements ProductNamer {
    public static String getBaseName(Map<NameFields, String> nameFragments) {

        // check to see if the map contains all the fragments needed. Throw runtimeexception if it
        // doesn't
        validateMap(nameFragments);

        StringBuilder sb = new StringBuilder();
        String delim = "_";
        sb.append(nameFragments.get(NameFields.REGION));
        sb.append(delim);
        sb.append(nameFragments.get(NameFields.GSD));
        sb.append(delim);

        // data source should only be 3 characters long
        String dataSource = nameFragments.get(NameFields.DATASRC);
        if (dataSource.length() > 3) {
            System.out.println("WARNING! dataSource:" + dataSource + " longer than 3 chars!");
            dataSource = dataSource.substring(0, 3);
            System.out.println("Will set data source to:"
                    + dataSource
                    + " but"
                    + " this might NOT conform to the ALTWG naming convention!");
        }
        sb.append(dataSource);
        sb.append(delim);
        sb.append(nameFragments.get(NameFields.DATATYPE));
        sb.append(delim);
        sb.append(nameFragments.get(NameFields.TBODY));
        sb.append(delim);
        sb.append(nameFragments.get(NameFields.CLATLON));
        sb.append(delim);
        sb.append("v");

        // remove '.' from version string
        String version = nameFragments.get(NameFields.VERSION);
        version = version.replaceAll("\\.", "");
        sb.append(version);

        // pds likes having filenames all in the same case, so chose lowercase
        String outFile = sb.toString().toLowerCase();
        return outFile;
    }

    /**
     * Parse the productName and return the portion of the name corresponding to a given field. Fields
     * are assumed separated by "_" in the filename.
     *
     * @param productName
     * @param fieldNum
     * @return
     */
    @Override
    public String getNameFrag(String productName, int fieldNum) {
        String[] fields = productName.split("_");
        String returnField = "ERROR";
        if (fieldNum > fields.length) {
            System.out.println("ERROR, field:" + fieldNum + " requested is beyond the number of fields found.");
            System.out.println("returning:" + returnField);
        } else {
            returnField = fields[fieldNum];
        }
        return returnField;
    }

    @Override
    public String productbaseName(FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal) {

        String gsd = "gsd";
        String dataSrc = "dataSrc";

        Map<NameFields, String> nameFragments = new HashMap<NameFields, String>();

        // data type
        String productType = altwgProduct.getFileFrag();
        nameFragments.put(NameFields.DATATYPE, productType);

        // product version
        String prodVer = getVersion(hdrBuilder);
        nameFragments.put(NameFields.VERSION, prodVer);

        // extract ground sample distance. gsdD is in mm!
        double gsdD = gsdFromHdr(hdrBuilder);
        int gsdI = (int) Math.round(gsdD);
        String fileUnits = "mm";
        gsd = String.format("%05d", gsdI) + fileUnits;
        nameFragments.put(NameFields.GSD, gsd);

        // data source
        HeaderTag key = HeaderTag.DATASRC;
        if (hdrBuilder.containsKey(key)) {
            dataSrc = hdrBuilder.getCard(key).getValue().toLowerCase();
            // check whether dataSrc needs to be modified
            dataSrc = HeaderTag.getSDP(dataSrc);
            // data source should only be 3 chars long
            if (dataSrc.length() > 3) {
                dataSrc = dataSrc.substring(0, 3);
            }
        }
        nameFragments.put(NameFields.DATASRC, dataSrc);

        // center lon
        key = HeaderTag.CLON;
        String cLon = null;
        if (hdrBuilder.containsKey(key)) {
            cLon = hdrBuilder.getCard(key).getValue();
        }
        if (cLon == null) {
            if (isGlobal) {
                // set center longitude to 0.0 if value not parsed and this is a global product
                cLon = "0.0";
            } else {
                String errMesg = "ERROR! Could not parse CLON from fits header!";
                throw new RuntimeException(errMesg);
            }
        }

        // center lat
        key = HeaderTag.CLAT;
        String cLat = null;
        if (hdrBuilder.containsKey(key)) {
            cLat = hdrBuilder.getCard(key).getValue();
        }
        if (cLat == null) {
            if (isGlobal) {
                // set center latitude to 0.0 if value not parsed and this is a global product
                cLat = "0.0";
            } else {
                String errMesg = "ERROR! Could not parse CLAT from fits header!";
                throw new RuntimeException(errMesg);
            }
        }
        String clahLon = ALTWGProductNamer.clahLon(cLat, cLon);
        nameFragments.put(NameFields.CLATLON, clahLon);

        // region
        String region = "l";
        if (isGlobal) {
            region = "g";
        }
        nameFragments.put(NameFields.REGION, region);

        // target body
        key = HeaderTag.TARGET;
        String tBody = "unkn";
        if (hdrBuilder.containsKey(key)) {
            tBody = hdrBuilder.getCard(key).getValue();
            tBody = getBodyStFrag(tBody);
        }
        nameFragments.put(NameFields.TBODY, tBody);

        // pds likes having filenames all in the same case, so chose lowercase
        String outFile = DartNamer.getBaseName(nameFragments);
        return outFile;
    }

    @Override
    public String getVersion(FitsHdrBuilder hdrBuilder) {

        String prodVer = "prodVer";

        // note: this has been changed to MAP_VER in the SIS
        HeaderTag key = HeaderTag.MAP_VER;
        // key = HeaderTag.PRODVERS;
        if (hdrBuilder.containsKey(key)) {
            prodVer = hdrBuilder.getCard(key).getValue();
            prodVer = prodVer.replaceAll("\\.", "");
        }
        return prodVer;
    }

    @Override
    public double gsdFromHdr(FitsHdrBuilder hdrBuilder) {

        String gsd = "gsd";
        double gsdD = Double.NaN;

        // extract ground sample distance using GSD first
        HeaderTag key = HeaderTag.GSD;
        if (hdrBuilder.containsKey(key)) {
            gsd = hdrBuilder.getCard(key).getValue();
            gsdD = StringUtil.parseSafeD(gsd);
            if (gsdD < 0D) {
                // keyword value not initialized
                gsdD = Double.NaN;
                System.out.println("WARNING! keyword GSD not set!");
            }
        } else {
            System.out.println("could not find " + key.toString() + " to parse GSD from.");
        }

        if (Double.isNaN(gsdD)) {
            // could not parse GSD into valid number, try GSDI
            key = HeaderTag.GSDI;
            if (hdrBuilder.containsKey(key)) {
                gsdD = StringUtil.parseSafeD(hdrBuilder.getCard(key).getValue());
                if (gsdD < 0D) {
                    // keyword value not initialized
                    gsdD = Double.NaN;
                    System.out.println("WARNING! keyword GSDI not set!");
                }
            } else {
                System.out.println("could not find " + key.toString() + " to parse GSD from.");
            }
            if (Double.isNaN(gsdD)) {
                // still cannot parse gsd. Set to -999
                System.out.println("WARNING: No valid values of GSD or GSDI could be parsed from fits header!");
                System.out.println("Setting gsd = -999");
                gsdD = -999D;
            }
        }

        if (hdrBuilder.getCard(key).getComment().contains("[cm]")) {
            // mandated to use mm! change the units
            gsdD = gsdD * 10.0D;
        }
        return gsdD;
    }

    @Override
    public NameConvention getNameConvention() {
        return NameConvention.DARTPRODUCT;
    }

    /**
     * Parse target body string to get the proper string fragment for the target body name.
     *
     * @param tBody
     * @return
     */
    private String getBodyStFrag(String tBody) {
        String returnFrag = tBody;
        if (tBody.toLowerCase().contains("didy")) {
            returnFrag = "didy";
        } else {
            if (tBody.toLowerCase().contains("dimo")) {
                returnFrag = "dimo";
            } else {
                System.out.println("Could not parse target string fragment from:" + tBody);
            }
        }
        return returnFrag;
    }

    private static void validateMap(Map<NameFields, String> nameFragments) {

        NameFields[] reqFields = new NameFields[7];
        reqFields[0] = NameFields.REGION;
        reqFields[1] = NameFields.GSD;
        reqFields[2] = NameFields.DATASRC;
        reqFields[3] = NameFields.DATATYPE;
        reqFields[4] = NameFields.TBODY;
        reqFields[5] = NameFields.CLATLON;
        reqFields[6] = NameFields.VERSION;

        for (NameFields requiredField : reqFields) {
            if (!nameFragments.containsKey(requiredField)) {
                String errMesg = "ERROR! Missing required field:" + requiredField.toString();
                throw new RuntimeException(errMesg);
            }
        }
    }

    @Override
    /** Parse the filename using the DART naming convention and return the GSD value. */
    public double gsdFromFilename(String filename) {
        String[] splitStr = filename.split("_");
        // GSD is second element
        String gsd = splitStr[1];
        gsd = gsd.replace("mm", "");
        return StringUtil.parseSafeD(gsd);
    }
}
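Both namers recover the GSD by splitting the basename on underscores and stripping the "mm" suffix from the second field. A self-contained sketch of that round trip, where Double.parseDouble stands in for StringUtil.parseSafeD and the filename is hypothetical:

```java
public class GsdParseSketch {

    // Field 1 of the underscore-delimited basename carries the GSD in mm, e.g. "00080mm".
    static double gsdFromFilename(String filename) {
        String gsd = filename.split("_")[1].replace("mm", "");
        return Double.parseDouble(gsd); // the real code uses StringUtil.parseSafeD
    }

    public static void main(String[] args) {
        System.out.println(gsdFromFilename("g_00080mm_ola_dtm_didy_feat1_v10.obj")); // prints 80.0
    }
}
```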

View File

@@ -24,25 +24,27 @@ package terrasaur.altwg.pipeline;
/**
 * Enum to store the different types of naming conventions.
 *
 * @author espirrc1
 */
public enum NameConvention {
    ALTPRODUCT, ALTNFTMLN, DARTPRODUCT, NOMATCH, NONEUSED;

    public static NameConvention parseNameConvention(String name) {
        for (NameConvention nameConvention : values()) {
            if (nameConvention.toString().toLowerCase().equals(name.toLowerCase())) {
                System.out.println("parsed naming convention:" + nameConvention.toString());
                return nameConvention;
            }
        }
        NameConvention nameConvention = NameConvention.NOMATCH;
        System.out.println("NameConvention.parseNameConvention()" + " could not parse naming convention:" + name
                + ". Returning:" + nameConvention.toString());
        return nameConvention;
    }
}
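parseNameConvention does a case-insensitive scan over the enum constants and falls back to NOMATCH for unknown strings. A compact sketch of that lookup (renamed, and without the logging):

```java
enum Convention {
    ALTPRODUCT, ALTNFTMLN, DARTPRODUCT, NOMATCH, NONEUSED;

    // Same case-insensitive scan as NameConvention.parseNameConvention().
    static Convention parse(String name) {
        for (Convention c : values()) {
            if (c.toString().equalsIgnoreCase(name)) {
                return c;
            }
        }
        return NOMATCH; // unknown strings fall back to NOMATCH
    }
}

public class ConventionDemo {
    public static void main(String[] args) {
        System.out.println(Convention.parse("DartProduct")); // prints DARTPRODUCT
        System.out.println(Convention.parse("bogus")); // prints NOMATCH
    }
}
```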

View File

@@ -25,11 +25,16 @@ package terrasaur.altwg.pipeline;
/**
 * Enum to describe the different parts of the product name. Used by concrete classes implementing
 * ProductNamer.
 *
 * @author espirrc1
 */
public enum NameFields {
    GSD,
    DATATYPE,
    VERSION,
    DATASRC,
    CLATLON,
    REGION,
    TBODY;
}

View File

@@ -35,99 +35,97 @@ import terrasaur.fits.FitsHdr.FitsHdrBuilder;
*/
public class NamingFactory {
    public static ProductNamer getNamingConvention(NameConvention namingConvention) {

        switch (namingConvention) {
            case ALTPRODUCT:
                return new ALTWGProductNamer();

            case ALTNFTMLN:
                return new AltwgMLNNamer();

            case DARTPRODUCT:
                return new DartNamer();

            default:
                System.err.println("ERROR! Naming convention:" + namingConvention.toString() + " not supported!");
                throw new RuntimeException();
        }
    }

    /**
     * Parse for keyword in pipeline config file that specifies what naming convention to use.
     *
     * @param pipeConfig
     * @return
     */
    public static ProductNamer parseNamingConvention(Map<AltPipelnEnum, String> pipeConfig, boolean verbose) {

        ProductNamer productNamer = null;
        if ((pipeConfig.containsKey(AltPipelnEnum.NAMINGCONVENTION))) {
            String value = pipeConfig.get(AltPipelnEnum.NAMINGCONVENTION);
            productNamer = parseNamingConvention(value);
        } else {
            String errMesg = "ERROR! Naming convention should have been defined in pipeConfig!";
            throw new RuntimeException(errMesg);
        }
        return productNamer;
    }

    /**
     * Parse string to determine the naming convention to use. Naming convention supplied by
     * ProductNamer interface.
     *
     * @param value
     * @return
     */
    public static ProductNamer parseNamingConvention(String value) {
        if (value.length() < 1) {
            String errMesg = "ERROR! Cannot pass empty string to NamingFactory.parseNamingConvention!";
            throw new RuntimeException(errMesg);
        }
        NameConvention nameConvention = NameConvention.parseNameConvention(value);
        return NamingFactory.getNamingConvention(nameConvention);
    }

    /**
     * Given the naming convention, hdrBuilder, productType, and original output file, return the
     * renamed output file and cross reference file. Output is returned as a File[] array where
     * array[0] is the output basename, array[1] is the cross-reference file. If no naming convention
     * is specified (NONEUSED) then array[0] is the same as outfile, array[1] is null.
     *
     * @param namingConvention
     * @param hdrBuilder
     * @param productType
     * @param isGlobal
     * @param outfile - proposed output filename. If Naming convention results in a renamed OBJ then
     *     this is not used. If no naming convention specified then outputFiles[0] = outfile.
     * @return
     */
    public static File[] getBaseNameAndCrossRef(
            NameConvention namingConvention,
            FitsHdrBuilder hdrBuilder,
            AltwgDataType productType,
            boolean isGlobal,
            String outfile) {

        File[] outputFiles = new File[2];

        // default to no renaming.
        File crossrefFile = null;
        String basename = outfile;

        if (namingConvention != NameConvention.NONEUSED) {
            ProductNamer productNamer = NamingFactory.getNamingConvention(namingConvention);
            basename = productNamer.productbaseName(hdrBuilder, productType, isGlobal);
            crossrefFile = new File(outfile + ".crf");
        }
        outputFiles[0] = new File(basename);
        outputFiles[1] = crossrefFile;
        return outputFiles;
    }
}
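getBaseNameAndCrossRef returns a two-element File[]: the (possibly renamed) basename and a ".crf" cross-reference file that is null when no convention is in use. A sketch of that contract without the FITS machinery, where a plain boolean stands in for the NameConvention.NONEUSED check and the filenames are hypothetical:

```java
import java.io.File;

public class CrossRefSketch {

    // Mirrors the getBaseNameAndCrossRef() contract: element 0 is the output basename,
    // element 1 is a ".crf" cross-reference file, or null when no convention is used.
    static File[] baseNameAndCrossRef(boolean useConvention, String renamedBase, String outfile) {
        File crossrefFile = null;
        String basename = outfile;
        if (useConvention) {
            basename = renamedBase;
            crossrefFile = new File(outfile + ".crf");
        }
        return new File[] {new File(basename), crossrefFile};
    }

    public static void main(String[] args) {
        File[] out = baseNameAndCrossRef(true, "l_00005mm_spc_nftdtm_feat1_v10", "dtm.fits");
        System.out.println(out[0] + " " + out[1]); // prints l_00005mm_spc_nftdtm_feat1_v10 dtm.fits.crf
    }
}
```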

View File

@@ -27,17 +27,15 @@ import terrasaur.fits.FitsHdr.FitsHdrBuilder;
public interface ProductNamer {
    public String getNameFrag(String productName, int fieldNum);

    public String productbaseName(FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal);

    public String getVersion(FitsHdrBuilder hdrBuilder);

    public double gsdFromHdr(FitsHdrBuilder hdrBuilder);

    public NameConvention getNameConvention();

    public double gsdFromFilename(String filename);
}

View File

@@ -25,7 +25,6 @@ package terrasaur.apps;
import java.io.File;
import java.nio.charset.Charset;
import java.util.*;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
@@ -33,14 +32,14 @@ import org.apache.commons.io.FileUtils;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import spice.basic.Plane;
import spice.basic.Vector3;
import terrasaur.smallBodyModel.BoundingBox;
import terrasaur.smallBodyModel.SmallBodyModel;
import terrasaur.templates.TerrasaurTool;
import terrasaur.utils.Log4j2Configurator;
import terrasaur.utils.NativeLibraryLoader;
import terrasaur.utils.PolyDataUtil;
import vtk.vtkGenericCell;
import vtk.vtkPoints;
import vtk.vtkPolyData;
@@ -55,51 +54,45 @@ import vtk.vtksbCellLocator;
*/
public class AdjustShapeModelToOtherShapeModel implements TerrasaurTool {
    private static final Logger logger = LogManager.getLogger();

    @Override
    public String shortDescription() {
        return "Adjust vertices of one shape model to lie on the surface of another.";
    }

    @Override
    public String fullDescription(Options options) {

        String header =
                """
                \n
                This program takes 2 shape models in OBJ format and tries to adjust
                to vertices of the first shape model so they lie on the surface of the
                second shape model. It does this by shooting a ray starting from the origin
                in the direction of each point of the first model into the second model and
                then changes the point of the first model to the intersection point.""";
        return TerrasaurTool.super.fullDescription(options, header, "");
    }
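The ray-shooting adjustment described in the header text can be sketched for a single vertex. A spherical target keeps the intersection closed-form; the real tool instead intersects each ray with an arbitrary OBJ mesh via a VTK cell locator:

```java
public class RadialAdjustSketch {

    // Move a vertex of the "from" model along the ray from the origin through the vertex
    // until it hits the "to" surface. For a sphere of radius r the intersection is simply
    // the vertex rescaled to length r.
    static double[] adjustToSphere(double[] v, double r) {
        double len = Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
        double s = r / len; // scale factor along the radial ray
        return new double[] {v[0] * s, v[1] * s, v[2] * s};
    }

    public static void main(String[] args) {
        double[] p = adjustToSphere(new double[] {3, 0, 4}, 10);
        System.out.printf("%.1f %.1f %.1f%n", p[0], p[1], p[2]); // prints 6.0 0.0 8.0
    }
}
```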
    private static Options defineOptions() {
        Options options = TerrasaurTool.defineOptions();
        options.addOption(Option.builder("from")
                .hasArg()
                .desc("path to first shape model in OBJ format which will get shifted to the second shape model")
                .build());
        options.addOption(Option.builder("to")
                .hasArg()
                .desc("path to second shape model in OBJ format which the first shape model will try to match to")
                .build());
        options.addOption(Option.builder("output")
                .hasArg()
                .desc(
                        "path to adjusted shape model in OBJ format generated by this program by shifting first to second")
                .build());
        options.addOption(Option.builder("filelist")
                .desc(
                        """
                        If specified then the second required argument to this program,
                        "to" is a file containing a list of OBJ files to match to.
                        In this situation the ray is shot into each of the shape models in this
@@ -108,204 +101,177 @@ public class AdjustShapeModelToOtherShapeModel implements TerrasaurTool {
                        list may be only a piece of the complete shape model (e.g. a mapola).
                        However, the global shape model formed when all these pieces are
                        combined together, may not have any holes or gaps.""")
                .build());
        options.addOption(Option.builder("fit-plane-radius")
                .hasArg()
                .desc(
                        """
                        If present, find a local normal at each point in the first shape
                        model by fitting a plane to all points within the specified radius.
                        Use this normal to adjust the point to the second shape model rather
                        than the radial vector.""")
                .build());
        options.addOption(Option.builder("local")
                .desc(
                        """
                        Use when adjusting a local OBJ file to another. The best fit plane to the
                        first shape model is used to adjust the vertices rather than the radial vector
                        for each point.
                        """)
                .build());
        return options;
    }

    static Vector3D computeMeanPoint(List<Vector3D> points) {
        Vector3D meanPoint = new Vector3D(0., 0., 0.);
        for (Vector3D point : points) meanPoint = meanPoint.add(point);
        meanPoint = meanPoint.scalarMultiply(1. / points.size());
        return meanPoint;
    }

    public static void adjustShapeModelToOtherShapeModel(
            vtkPolyData frompolydata, ArrayList<vtkPolyData> topolydata, double planeRadius, boolean localModel)
            throws Exception {
        vtkPoints points = frompolydata.GetPoints();
        long numberPoints = frompolydata.GetNumberOfPoints();

        boolean fitPlane = (planeRadius > 0);
        SmallBodyModel sbModel = new SmallBodyModel(frompolydata);
        double diagonalLength = new BoundingBox(frompolydata.GetBounds()).getDiagonalLength();

        ArrayList<vtksbCellLocator> cellLocators = new ArrayList<>();
        for (vtkPolyData polydata : topolydata) {
            vtksbCellLocator cellLocator = new vtksbCellLocator();
            cellLocator.SetDataSet(polydata);
            cellLocator.CacheCellBoundsOn();
            cellLocator.AutomaticOn();
            cellLocator.BuildLocator();
            cellLocators.add(cellLocator);
        }

        vtkGenericCell cell = new vtkGenericCell();
        double tol = 1e-6;
        double[] t = new double[1];
        double[] pcoords = new double[3];
        int[] subId = new int[1];
        long[] cell_id = new long[1];

        double[] localNormal = null;
        if (localModel) {
            // fit a plane to the local model and check that the normal points outward
            Plane localPlane = PolyDataUtil.fitPlaneToPolyData(frompolydata);
            Vector3 localNormalVector = localPlane.getNormal();
            if (localNormalVector.dot(localPlane.getPoint()) < 0) localNormalVector = localNormalVector.negate();
            localNormal = localNormalVector.toArray();
        }

        double[] p = new double[3];
        Vector3D origin = new Vector3D(0., 0., 0.);
        for (int i = 0; i < numberPoints; ++i) {
            points.GetPoint(i, p);
            Vector3D thisPoint = new Vector3D(p);

            Vector3D lookDir;
            if (fitPlane) {
                // fit a plane to the local area
                System.arraycopy(p, 0, origin.toArray(), 0, 3);
                lookDir = new Vector3D(sbModel.getNormalAtPoint(p, planeRadius)).normalize();
            } else if (localModel) {
                System.arraycopy(p, 0, origin.toArray(), 0, 3);
                lookDir = new Vector3D(localNormal).normalize();
            } else {
                // use radial vector
                lookDir = new Vector3D(p).normalize();
            }

            Vector3D lookPt = lookDir.scalarMultiply(diagonalLength);
            lookPt = lookPt.add(thisPoint);

            List<Vector3D> intersections = new ArrayList<>();
            for (vtksbCellLocator cellLocator : cellLocators) {
                double[] intersectPoint = new double[3];

                // trace ray from thisPoint to the lookPt - Assume cell intersection is the closest one if
                // there are multiple?
                // NOTE: result should return 1 in case of intersection but doesn't sometimes.
                // Use the norm of intersection point to test for intersection instead.
                int result = cellLocator.IntersectWithLine(
                        thisPoint.toArray(), lookPt.toArray(), tol, t, intersectPoint, pcoords, subId, cell_id, cell);
                Vector3D intersectVector = new Vector3D(intersectPoint);

                NavigableMap<Double, Vector3D> pointsMap = new TreeMap<>();
                if (intersectVector.getNorm() > 0) {
                    pointsMap.put(thisPoint.subtract(intersectVector).getNorm(), intersectVector);
                }

                // look in the other direction
                lookPt = lookDir.scalarMultiply(-diagonalLength);
                lookPt = lookPt.add(thisPoint);
                result = cellLocator.IntersectWithLine(
                        thisPoint.toArray(), lookPt.toArray(), tol, t, intersectPoint, pcoords, subId, cell_id, cell);
                intersectVector = new Vector3D(intersectPoint);
                if (intersectVector.getNorm() > 0) {
                    pointsMap.put(thisPoint.subtract(intersectVector).getNorm(), intersectVector);
                }

                if (!pointsMap.isEmpty()) intersections.add(pointsMap.get(pointsMap.firstKey()));
            }

            if (intersections.isEmpty()) throw new Exception("Error: no intersections at all");

            Vector3D meanIntersectionPoint = computeMeanPoint(intersections);
            points.SetPoint(i, meanIntersectionPoint.toArray());
        }
    }

    public static void main(String[] args) throws Exception {
        TerrasaurTool defaultOBJ = new AdjustShapeModelToOtherShapeModel();

        Options options = defineOptions();

        CommandLine cl = defaultOBJ.parseArgs(args, options);
        Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
        for (MessageLabel ml : startupMessages.keySet())
            logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));

        boolean loadListFromFile = cl.hasOption("filelist");
        double planeRadius = Double.parseDouble(cl.getOptionValue("fit-plane-radius", "-1"));
        boolean localModel = cl.hasOption("local");

        NativeLibraryLoader.loadVtkLibraries();

        String fromfile = cl.getOptionValue("from");
        String tofile = cl.getOptionValue("to");
        String outfile = cl.getOptionValue("output");

        Log4j2Configurator.getInstance();

        logger.info("loading <from-obj-file>: {}", fromfile);
        vtkPolyData frompolydata = PolyDataUtil.loadShapeModelAndComputeNormals(fromfile);

        ArrayList<vtkPolyData> topolydata = new ArrayList<>();
        if (loadListFromFile) {
            List<String> lines = FileUtils.readLines(new File(tofile), Charset.defaultCharset());
            for (String file : lines) {
                // checking length prevents trying to load an empty line, such as the
                // last line of the file.
                if (file.length() > 1) {
                    logger.info("loading <to-obj-file>: {}", file);
                    topolydata.add(PolyDataUtil.loadShapeModelAndComputeNormals(file));
                }
            }
        } else {
            logger.info("loading <to-obj-file>: {}", tofile);
            topolydata.add(PolyDataUtil.loadShapeModelAndComputeNormals(tofile));
        }

        adjustShapeModelToOtherShapeModel(frompolydata, topolydata, planeRadius, localModel);

        PolyDataUtil.saveShapeModelAsOBJ(frompolydata, outfile);
        logger.info("wrote {}", outfile);
    }
}
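Per vertex, the adjustment above boils down to collecting candidate ray intersections (one or two per target model), keeping the candidate nearest the vertex, and averaging the survivors. A minimal, self-contained sketch of that selection logic in plain Java (no VTK; the class and method names are illustrative, not part of Terrasaur):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.NavigableMap;
import java.util.TreeMap;

public class NearestIntersectionSketch {

    // Component-wise average of 3D points, as computeMeanPoint does with Vector3D.
    static double[] mean(List<double[]> pts) {
        double[] m = new double[3];
        for (double[] p : pts) {
            for (int i = 0; i < 3; i++) m[i] += p[i];
        }
        for (int i = 0; i < 3; i++) m[i] /= pts.size();
        return m;
    }

    static double dist(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < 3; i++) s += (a[i] - b[i]) * (a[i] - b[i]);
        return Math.sqrt(s);
    }

    // Keep the candidate closest to the vertex, mirroring the TreeMap keyed on distance:
    // firstEntry() of the sorted map is the nearest intersection.
    static double[] nearest(double[] vertex, List<double[]> candidates) {
        NavigableMap<Double, double[]> byDistance = new TreeMap<>();
        for (double[] c : candidates) byDistance.put(dist(vertex, c), c);
        return byDistance.firstEntry().getValue();
    }

    public static void main(String[] args) {
        double[] vertex = {1, 0, 0};
        List<double[]> candidates = new ArrayList<>();
        candidates.add(new double[] {3, 0, 0}); // intersection along +lookDir
        candidates.add(new double[] {1.5, 0, 0}); // intersection along -lookDir, closer
        double[] pick = nearest(vertex, candidates);
        System.out.println(pick[0]);
    }
}
```

One nearest candidate is chosen per target model; the mean of those picks becomes the vertex's new position, which is how overlapping tile models are reconciled.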
@@ -44,121 +44,117 @@ import vtk.vtkPolyData;
*/
public class AppendOBJ implements TerrasaurTool {
    private static final Logger logger = LogManager.getLogger();

    @Override
    public String shortDescription() {
        return "Combine multiple shape files (OBJ or VTK format) into one.";
    }

    @Override
    public String fullDescription(Options options) {
        String header = "This program combines input shape models into a single shape model.";
        return TerrasaurTool.super.fullDescription(options, header, "");
    }

    private static Options defineOptions() {
        Options options = TerrasaurTool.defineOptions();
        options.addOption(Option.builder("logFile")
                .hasArg()
                .desc("If present, save screen output to log file.")
                .build());
        StringBuilder sb = new StringBuilder();
        for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
        options.addOption(Option.builder("logLevel")
                .hasArg()
                .desc("If present, print messages above selected priority. Valid values are "
                        + sb.toString().trim()
                        + ". Default is INFO.")
                .build());
        options.addOption(Option.builder("boundary")
                .desc("Only save out boundary. This option implies -vtk.")
                .build());
        options.addOption(Option.builder("decimate")
                .hasArg()
                .desc(
                        "Reduce the number of facets in the output shape model. The argument should be between 0 and 1. "
                                + "For example, if a model has 100 facets and <arg> is 0.90, "
                                + "there will be approximately 10 facets after the decimation.")
                .build());
        options.addOption(Option.builder("input")
                .required()
                .hasArgs()
                .desc("input file(s) to read. Format is derived from the allowed extension: "
                        + "icq, llr, obj, pds, plt, ply, stl, or vtk. Multiple files can be specified "
                        + "with a single -input option, separated by whitespace. Alternatively, you may "
                        + "specify -input multiple times.")
                .build());
        options.addOption(Option.builder("output")
                .required()
                .hasArg()
                .desc("output file to write.")
                .build());
        options.addOption(Option.builder("vtk")
                .desc("Save output file in VTK format rather than OBJ.")
                .build());
        return options;
    }

    public static void main(String[] args) throws Exception {
        TerrasaurTool defaultOBJ = new AppendOBJ();

        Options options = defineOptions();

        CommandLine cl = defaultOBJ.parseArgs(args, options);
        Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
        for (MessageLabel ml : startupMessages.keySet())
            logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));

        boolean boundaryOnly = cl.hasOption("boundary");
        boolean vtkFormat = boundaryOnly || cl.hasOption("vtk");
        boolean decimate = cl.hasOption("decimate");
        double decimationPercentage = decimate ? Double.parseDouble(cl.getOptionValue("decimate")) : 1.0;

        NativeLibraryLoader.loadVtkLibraries();

        String outfile = cl.getOptionValue("output");
        String[] infiles = cl.getOptionValues("input");

        vtkAppendPolyData append = new vtkAppendPolyData();
        append.UserManagedInputsOn();
        append.SetNumberOfInputs(infiles.length);
        for (int i = 0; i < infiles.length; ++i) {
            logger.info("loading {} {} / {}", infiles[i], i + 1, infiles.length);
            vtkPolyData polydata = PolyDataUtil.loadShapeModel(infiles[i]);
            if (polydata == null) {
                logger.warn("Cannot load {}", infiles[i]);
            } else {
                if (boundaryOnly) {
                    vtkPolyData boundary = PolyDataUtil.getBoundary(polydata);
                    boundary.GetCellData().SetScalars(null);
                    polydata.DeepCopy(boundary);
                }
                append.SetInputDataByNumber(i, polydata);
            }
            System.gc();
            vtkObjectBase.JAVA_OBJECT_MANAGER.gc(false);
        }
        append.Update();

        vtkPolyData outputShape = append.GetOutput();
        if (decimate) PolyDataUtil.decimatePolyData(outputShape, decimationPercentage);

        if (vtkFormat) PolyDataUtil.saveShapeModelAsVTK(outputShape, outfile);
        else PolyDataUtil.saveShapeModelAsOBJ(append.GetOutput(), outfile);

        logger.info("Wrote " + outfile);
    }
}
@@ -39,72 +39,65 @@ import terrasaur.utils.batch.GridType;
public class BatchSubmit implements TerrasaurTool {
    private static final Logger logger = LogManager.getLogger();

    @Override
    public String shortDescription() {
        return "Run a command on a cluster.";
    }

    @Override
    public String fullDescription(Options options) {
        String footer = "\nRun a command on a cluster.\n";
        return TerrasaurTool.super.fullDescription(options, "", footer);
    }

    private static Options defineOptions() {
        Options options = TerrasaurTool.defineOptions();
        options.addOption(Option.builder("command")
                .required()
                .hasArgs()
                .desc("Required. Command(s) to run.")
                .build());
        StringBuilder sb = new StringBuilder();
        for (GridType type : GridType.values()) sb.append(String.format("%s ", type.name()));
        options.addOption(Option.builder("gridType")
                .hasArg()
                .desc("Grid type. Valid values are " + sb + ". Default is LOCAL.")
                .build());
        options.addOption(Option.builder("workingDir")
                .hasArg()
                .desc("Working directory to run command. Default is current directory.")
                .build());
        return options;
    }

    public static void main(String[] args) {
        TerrasaurTool defaultOBJ = new BatchSubmit();

        Options options = defineOptions();

        CommandLine cl = defaultOBJ.parseArgs(args, options);
        Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
        for (MessageLabel ml : startupMessages.keySet())
            logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));

        List<String> cmdList = Arrays.asList(cl.getOptionValues("command"));

        BatchType batchType = BatchType.GRID_ENGINE;
        GridType gridType = cl.hasOption("gridType") ? GridType.valueOf(cl.getOptionValue("gridType")) : GridType.LOCAL;
        BatchSubmitI submitter = BatchSubmitFactory.getBatchSubmit(cmdList, batchType, gridType);
        String workingDir = "";
        try {
            submitter.runBatchSubmitinDir(workingDir);
        } catch (InterruptedException | IOException e) {
            logger.error(e.getLocalizedMessage(), e);
        }
    }
}
@@ -64,35 +64,46 @@ import terrasaur.utils.math.MathConversions;
public class CKFromSumFile implements TerrasaurTool {
    private static final Logger logger = LogManager.getLogger();

    @Override
    public String shortDescription() {
        return "Create a CK from a list of sumfiles.";
    }

    @Override
    public String fullDescription(Options options) {
        String footer = "Create a CK from a list of sumfiles.";
        return TerrasaurTool.super.fullDescription(options, "", footer);
    }

    private static Options defineOptions() {
        Options options = TerrasaurTool.defineOptions();
        options.addOption(Option.builder("config")
                .required()
                .hasArg()
                .desc("Required. Name of configuration file.")
                .build());
        options.addOption(Option.builder("dumpConfig")
                .hasArg()
                .desc("Write out an example configuration to the named file.")
                .build());
        options.addOption(Option.builder("logFile")
                .hasArg()
                .desc("If present, save screen output to log file.")
                .build());
        StringBuilder sb = new StringBuilder();
        for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
        options.addOption(Option.builder("logLevel")
                .hasArg()
                .desc("If present, print messages above selected priority. Valid values are "
                        + sb.toString().trim() + ". Default is INFO.")
                .build());
        options.addOption(Option.builder("sumFile")
                .hasArg()
                .required()
                .desc(
                        """
"""
Required. File listing sumfiles to read. This is a text file, one per line.
Lines starting with # are ignored.
@@ -106,212 +117,212 @@ public class CKFromSumFile implements TerrasaurTool {
D717506131G0.SUM
# This is a comment
D717506132G0.SUM
                        """)
                .build());
        return options;
    }

    private final CKFromSumFileConfig config;
    private final NavigableMap<SumFile, String> sumFiles;

    private CKFromSumFile() {
        config = null;
        sumFiles = null;
    }

    public CKFromSumFile(CKFromSumFileConfig config, NavigableMap<SumFile, String> sumFiles) {
        this.config = config;
        this.sumFiles = sumFiles;
    }

    public String writeMSOPCKFiles(String basename, List<String> comments) throws SpiceException {
        ReferenceFrame instrFrame = new ReferenceFrame(config.instrumentFrameName());
        ReferenceFrame scFrame = new ReferenceFrame(config.spacecraftFrame());
        ReferenceFrame j2000 = new ReferenceFrame("J2000");
        ReferenceFrame bodyFixed = new ReferenceFrame(config.bodyFrame());
        ReferenceFrame ref = config.J2000() ? j2000 : bodyFixed;

        logger.debug("Body fixed frame: {}", bodyFixed.getName());
        logger.debug("Instrument frame: {}", instrFrame.getName());
        logger.debug("Spacecraft frame: {}", scFrame.getName());
        logger.debug(" Reference frame: {}", ref.getName());

        File commentFile = new File(basename + "-comments.txt");
        if (commentFile.exists())
            if (!commentFile.delete()) logger.error("{} exists but cannot be deleted!", commentFile.getPath());
        String setupFile = basename + ".setup";
        String inputFile = basename + ".inp";

        try (PrintWriter pw = new PrintWriter(commentFile)) {
            StringBuilder sb = new StringBuilder();
            DateTimeFormatter dtf = DateTimeFormatter.ofPattern("uuuu-MM-dd HH:mm:ss z");
            ZonedDateTime now = ZonedDateTime.now(ZoneId.of("UTC"));
            sb.append("This CK was created on ").append(dtf.format(now)).append(" from the following sumFiles:\n");
            for (SumFile sumFile : sumFiles.keySet()) {
                sb.append(String.format("\t%s %s\n", sumFile.utcString(), sumFiles.get(sumFile)));
            }
            sb.append("\n");
            sb.append("providing the orientation of ")
                    .append(scFrame.getName())
                    .append(" with respect to ")
                    .append(config.J2000() ? "J2000" : bodyFixed.getName())
                    .append(". ");
            double first = new TDBTime(sumFiles.firstKey().utcString()).getTDBSeconds();
            double last = new TDBTime(sumFiles.lastKey().utcString()).getTDBSeconds() + config.extend();
            sb.append("The coverage period is ")
                    .append(new TDBTime(first).toUTCString("ISOC", 3))
                    .append(" to ")
                    .append(new TDBTime(last).toUTCString("ISOC", 3))
                    .append(" UTC.");

            String allComments = sb.toString();
            for (String comment : allComments.split("\\r?\\n")) pw.println(WordUtils.wrap(comment, 80));
        } catch (FileNotFoundException e) {
            logger.error(e.getLocalizedMessage(), e);
        }

        Map<String, String> map = new TreeMap<>();
        map.put("LSK_FILE_NAME", "'" + config.lsk() + "'");
        map.put("SCLK_FILE_NAME", "'" + config.sclk() + "'");
        map.put("CK_TYPE", "3");
        map.put("COMMENTS_FILE_NAME", String.format("'%s'", commentFile.getPath()));
        map.put("INSTRUMENT_ID", String.format("%d", scFrame.getIDCode()));
        map.put("REFERENCE_FRAME_NAME", String.format("'%s'", config.J2000() ? "J2000" : bodyFixed.getName()));
        if (!config.fk().isEmpty()) map.put("FRAMES_FILE_NAME", "'" + config.fk() + "'");
        map.put("ANGULAR_RATE_PRESENT", "'MAKE UP/NO AVERAGING'");
        map.put("INPUT_TIME_TYPE", "'UTC'");
        map.put("INPUT_DATA_TYPE", "'SPICE QUATERNIONS'");
        map.put("PRODUCER_ID", "'Hari.Nair@jhuapl.edu'");

        try (PrintWriter pw = new PrintWriter(setupFile)) {
            pw.println("\\begindata");
            for (String key : map.keySet()) {
                pw.printf("%s = %s\n", key, map.get(key));
            }
        } catch (FileNotFoundException e) {
            logger.error(e.getLocalizedMessage(), e);
        }

        NavigableMap<Double, SpiceQuaternion> attitudeMap = new TreeMap<>();
        for (SumFile s : sumFiles.keySet()) {
            TDBTime t = new TDBTime(s.utcString());

            Vector3[] rows = new Vector3[3];
            rows[0] = MathConversions.toVector3(s.cx());
            rows[1] = MathConversions.toVector3(s.cy());
            rows[2] = MathConversions.toVector3(s.cz());

            Vector3 row0 = rows[Math.abs(config.flipX()) - 1];
            Vector3 row1 = rows[Math.abs(config.flipY()) - 1];
            Vector3 row2 = rows[Math.abs(config.flipZ()) - 1];

            if (config.flipX() < 0) row0 = row0.negate();
            if (config.flipY() < 0) row1 = row1.negate();
            if (config.flipZ() < 0) row2 = row2.negate();

            Matrix33 refToInstr = new Matrix33(row0, row1, row2);

            if (config.J2000()) {
                Matrix33 j2000ToBodyFixed = j2000.getPositionTransformation(bodyFixed, t);
                refToInstr = refToInstr.mxm(j2000ToBodyFixed);
            }

            Matrix33 instrToSc = instrFrame.getPositionTransformation(scFrame, t);
            Matrix33 refToSc = instrToSc.mxm(refToInstr);

            SpiceQuaternion q = new SpiceQuaternion(refToSc);
            attitudeMap.put(t.getTDBSeconds(), q);
        }

        if (config.extend() > 0) {
            var lastEntry = attitudeMap.lastEntry();
            attitudeMap.put(lastEntry.getKey() + config.extend(), lastEntry.getValue());
        }
map.put("REFERENCE_FRAME_NAME", String.format("'%s'", config.J2000() ? "J2000" : bodyFixed.getName()));
if (!config.fk().isEmpty()) map.put("FRAMES_FILE_NAME", "'" + config.fk() + "'");
map.put("ANGULAR_RATE_PRESENT", "'MAKE UP/NO AVERAGING'");
map.put("INPUT_TIME_TYPE", "'UTC'");
map.put("INPUT_DATA_TYPE", "'SPICE QUATERNIONS'");
map.put("PRODUCER_ID", "'Hari.Nair@jhuapl.edu'");
try (PrintWriter pw = new PrintWriter(setupFile)) {
pw.println("\\begindata");
for (String key : map.keySet()) {
pw.printf("%s = %s\n", key, map.get(key));
}
} catch (FileNotFoundException e) {
logger.error(e.getLocalizedMessage(), e);
}
NavigableMap<Double, SpiceQuaternion> attitudeMap = new TreeMap<>();
for (SumFile s : sumFiles.keySet()) {
TDBTime t = new TDBTime(s.utcString());
Vector3[] rows = new Vector3[3];
rows[0] = MathConversions.toVector3(s.cx());
rows[1] = MathConversions.toVector3(s.cy());
rows[2] = MathConversions.toVector3(s.cz());
Vector3 row0 = rows[Math.abs(config.flipX()) - 1];
Vector3 row1 = rows[Math.abs(config.flipY()) - 1];
Vector3 row2 = rows[Math.abs(config.flipZ()) - 1];
if (config.flipX() < 0) row0 = row0.negate();
if (config.flipY() < 0) row1 = row1.negate();
if (config.flipZ() < 0) row2 = row2.negate();
Matrix33 refToInstr = new Matrix33(row0, row1, row2);
if (config.J2000()) {
Matrix33 j2000ToBodyFixed = j2000.getPositionTransformation(bodyFixed, t);
refToInstr = refToInstr.mxm(j2000ToBodyFixed);
}
Matrix33 instrToSc = instrFrame.getPositionTransformation(scFrame, t);
Matrix33 refToSc = instrToSc.mxm(refToInstr);
SpiceQuaternion q = new SpiceQuaternion(refToSc);
attitudeMap.put(t.getTDBSeconds(), q);
}
if (config.extend() > 0) {
var lastEntry = attitudeMap.lastEntry();
attitudeMap.put(lastEntry.getKey() + config.extend(), lastEntry.getValue());
}
try (PrintWriter pw = new PrintWriter(new FileWriter(inputFile))) {
for (double t : attitudeMap.keySet()) {
SpiceQuaternion q = attitudeMap.get(t);
pw.printf(
"%s %.14e %.14e %.14e %.14e\n",
new TDBTime(t).toUTCString("ISOC", 6), q.getElt(0), q.getElt(1), q.getElt(2), q.getElt(3));
}
} catch (IOException e) {
logger.error(e.getLocalizedMessage(), e);
}
return String.format("msopck %s %s %s.bc", setupFile, inputFile, basename);
}
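The flipX/flipY/flipZ options in writeMSOPCKFiles select which sumfile axis (cx, cy, cz) supplies each row of the reference-to-instrument matrix, with a negative value negating the selected row. A minimal standalone sketch of that 1-based indexing convention, using plain arrays in place of the SPICE Vector3/Matrix33 types (`FlipDemo` and `applyFlip` are illustrative names, not part of the tool):

```java
public class FlipDemo {
    // Reorder and negate three basis rows according to 1-based flip codes:
    // flip = +n selects row n-1; flip = -n selects row n-1 and negates it.
    static double[][] applyFlip(double[][] rows, int flipX, int flipY, int flipZ) {
        int[] flips = {flipX, flipY, flipZ};
        double[][] out = new double[3][3];
        for (int i = 0; i < 3; i++) {
            double sign = flips[i] < 0 ? -1.0 : 1.0;
            double[] src = rows[Math.abs(flips[i]) - 1];
            for (int j = 0; j < 3; j++) out[i][j] = sign * src[j];
        }
        return out;
    }

    public static void main(String[] args) {
        double[][] rows = {{1, 0, 0}, {0, 1, 0}, {0, 0, 1}};
        // Swap the first two axes and negate the third: flips (2, 1, -3)
        double[][] m = applyFlip(rows, 2, 1, -3);
        System.out.println(m[0][1] + " " + m[1][0] + " " + m[2][2]);
    }
}
```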
public static void main(String[] args) throws SpiceException, IOException {
TerrasaurTool defaultOBJ = new CKFromSumFile();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
if (cl.hasOption("dumpConfig")) {
CKFromSumFileConfigFactory factory = new CKFromSumFileConfigFactory();
PropertiesConfiguration config = factory.toConfig(factory.getTemplate());
try {
String filename = cl.getOptionValue("dumpConfig");
config.write(new PrintWriter(filename));
logger.info("Wrote {}", filename);
} catch (ConfigurationException | IOException e) {
logger.error(e.getLocalizedMessage(), e);
}
System.exit(0);
}
NativeLibraryLoader.loadSpiceLibraries();
PropertiesConfiguration config = null;
CKFromSumFileConfigFactory factory = new CKFromSumFileConfigFactory();
try {
config = new Configurations().properties(new File(cl.getOptionValue("config")));
} catch (ConfigurationException e1) {
logger.error(e1.getLocalizedMessage(), e1);
}
CKFromSumFileConfig appConfig = factory.fromConfig(config);
for (String kernel : appConfig.metakernel()) KernelDatabase.load(kernel);
NavigableMap<SumFile, String> sumFiles = new TreeMap<>((o1, o2) -> {
try {
return Double.compare(
new TDBTime(o1.utcString()).getTDBSeconds(), new TDBTime(o2.utcString()).getTDBSeconds());
} catch (SpiceErrorException e) {
logger.error(e.getLocalizedMessage(), e);
}
return 0;
});
List<String> lines = FileUtils.readLines(new File(cl.getOptionValue("sumFile")), Charset.defaultCharset());
for (String line : lines) {
if (line.strip().startsWith("#")) continue;
String[] parts = line.strip().split("\\s+");
String filename = parts[0].trim();
sumFiles.put(SumFile.fromFile(new File(filename)), FilenameUtils.getBaseName(filename));
}
CKFromSumFile app = new CKFromSumFile(appConfig, sumFiles);
TDBTime begin = new TDBTime(sumFiles.firstKey().utcString());
TDBTime end = new TDBTime(sumFiles.lastKey().utcString());
String picture = "YYYY_DOY";
String command = app.writeMSOPCKFiles(
String.format("ck_%s_%s", begin.toString(picture), end.toString(picture)), new ArrayList<>());
logger.info("To generate the CK, run:\n\t{}", command);
logger.info("Finished.");
}
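main expects the -sumFile argument to name a plain-text list with one sumfile path per line; lines beginning with # are skipped, and only the first whitespace-separated token on each line is used. An illustrative list (paths and filenames are hypothetical):

```
# sumfiles for CK generation
path/to/N647914431.SUM
path/to/N647915839.SUM
```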
}


@@ -59,531 +59,511 @@ import vtk.vtkPolyData;
*/
public class ColorSpots implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
private ColorSpots() {}
@Override
public String shortDescription() {
return "Assign values to facets in a shape model from an input dataset.";
}
@Override
public String fullDescription(Options options) {
String header = "";
String footer =
"""
This program reads an OBJ file along with a CSV file containing locations and values and writes the \
mean value and standard deviation for each facet within a specified distance of an input point to standard \
out. Latitude and longitude are specified in degrees. Longitude is east longitude. Units of x, y, z, and \
radius are the same as the units in the supplied OBJ file.
""";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
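Unless -noWeight is given, the statistics methods below weight each point's value by its distance from the facet center or vertex with a linear falloff, value * (1 - d/r). A minimal sketch of that weighting with plain doubles (`WeightDemo` is an illustrative name, not part of the tool):

```java
public class WeightDemo {
    // Linear falloff used when accumulating a point's value into a facet:
    // full weight at the facet, zero weight at the search radius.
    static double weighted(double value, double dist, double radius) {
        return value * (1.0 - dist / radius);
    }

    public static void main(String[] args) {
        double radius = 2.0;
        // A point at half the search radius contributes half its value.
        System.out.println(weighted(10.0, 1.0, radius)); // prints 5.0
    }
}
```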
private enum FORMAT {
LL,
LLR,
XYZ
}
private enum FIELD {
MIN,
MAX,
MEDIAN,
N,
RMS,
SUM,
STD,
VARIANCE
}
private vtkPolyData polyData;
private SmallBodyModel smallBodyModel;
public ColorSpots(vtkPolyData polyData) {
this.polyData = polyData;
this.smallBodyModel = new SmallBodyModel(polyData);
}
private long getXYZ(double lat, double lon, double[] pt) {
double[] origin = {0., 0., 0.};
Vector3D lookDir = new Vector3D(lon, lat);
return smallBodyModel.computeRayIntersection(origin, lookDir.toArray(), pt);
}
private ArrayList<double[]> readCSV(String filename, FORMAT format) {
ArrayList<double[]> returnArray = new ArrayList<>();
try (Reader in = new FileReader(filename)) {
Iterable<CSVRecord> records = CSVFormat.DEFAULT.parse(in);
for (CSVRecord record : records) {
double[] values = new double[4];
values[3] = Double.NaN;
if (format == FORMAT.LL) {
double lon = Math.toRadians(Double.parseDouble(record.get(0).trim()));
double lat = Math.toRadians(Double.parseDouble(record.get(1).trim()));
if (getXYZ(lat, lon, values) < 0) continue;
try {
values[3] = Double.parseDouble(record.get(2));
} catch (NumberFormatException e) {
continue;
}
} else {
if (format == FORMAT.LLR) {
double lon = Math.toRadians(Double.parseDouble(record.get(0).trim()));
double lat = Math.toRadians(Double.parseDouble(record.get(1).trim()));
double rad = Double.parseDouble(record.get(2).trim());
Vector3D xyz = new Vector3D(lon, lat).scalarMultiply(rad);
values[0] = xyz.getX();
values[1] = xyz.getY();
values[2] = xyz.getZ();
} else if (format == FORMAT.XYZ) {
values[0] = Double.parseDouble(record.get(0).trim());
values[1] = Double.parseDouble(record.get(1).trim());
values[2] = Double.parseDouble(record.get(2).trim());
}
smallBodyModel.findClosestCell(values);
try {
values[3] = Double.parseDouble(record.get(3).trim());
} catch (NumberFormatException e) {
continue;
}
}
returnArray.add(values);
}
} catch (IOException e) {
logger.error(e.getLocalizedMessage(), e);
}
return returnArray;
}
public TreeMap<Long, DescriptiveStatistics> getStatsFast(
ArrayList<double[]> valuesList, double radius, boolean weight, boolean atVertices) {
return atVertices
? getStatsVertex(valuesList, radius, weight)
: getStatsFacet(valuesList, radius, weight);
}
private TreeMap<Long, DescriptiveStatistics> getStatsVertex(
ArrayList<double[]> valuesList, double radius, boolean weight) {
TreeMap<Long, DescriptiveStatistics> statMap = new TreeMap<>();
for (long i = 0; i < smallBodyModel.getSmallBodyPolyData().GetNumberOfPoints(); i++) {
DescriptiveStatistics stats = new DescriptiveStatistics();
statMap.put(i, stats);
}
// double[] xyz = new double[3];
for (double[] values : valuesList) {
Vector3D xyz = new Vector3D(values[0], values[1], values[2]);
double value = values[3];
vtkIdList pointIDs = new vtkIdList();
smallBodyModel.getPointLocator().FindPointsWithinRadius(radius, xyz.toArray(), pointIDs);
// pointIDs.InsertNextId(smallBodyModel.getPointLocator().FindClosestPoint(xyz));
for (int i = 0; i < pointIDs.GetNumberOfIds(); i++) {
long pointID = pointIDs.GetId(i);
DescriptiveStatistics stats = statMap.get(pointID);
Vector3D p = new Vector3D(smallBodyModel.getSmallBodyPolyData().GetPoint(pointID));
double dist = p.distance(xyz);
// cell center can be farther than radius as long as one point is closer than radius
if (dist < radius) {
double thisValue = value;
if (weight) thisValue *= (1 - dist / radius);
stats.addValue(thisValue);
// if (thisValue < 0)
// System.out.printf("Cell %d dist %f radius %f xyz %f %f %f value %f thisValue %e\n",
// cellID, dist, radius,
// xyz[0], xyz[1], xyz[2],
// value, thisValue);
}
} // point loop
} // values loop
return statMap;
}
private TreeMap<Long, DescriptiveStatistics> getStatsFacet(
ArrayList<double[]> valuesList, double radius, boolean weight) {
TreeMap<Long, DescriptiveStatistics> statMap = new TreeMap<>();
for (long i = 0; i < smallBodyModel.getSmallBodyPolyData().GetNumberOfCells(); i++) {
DescriptiveStatistics stats = new DescriptiveStatistics();
statMap.put(i, stats);
}
for (double[] values : valuesList) {
Vector3D xyz = new Vector3D(values[0], values[1], values[2]);
double value = values[3];
Set<Long> cellIDs = smallBodyModel.findClosestCellsWithinRadius(xyz.toArray(), radius);
// cellIDs.add(smallBodyModel.findClosestCell(xyz));
for (Long cellID : cellIDs) {
DescriptiveStatistics stats = statMap.get(cellID);
TriangularFacet tf = PolyDataUtil.getFacet(polyData, cellID);
Vector3D p = MathConversions.toVector3D(tf.getCenter());
double dist = p.distance(xyz);
// cell center can be farther than radius as long as one point is closer than radius
if (dist < radius) {
double thisValue = value;
if (weight) thisValue *= (1 - dist / radius);
stats.addValue(thisValue);
// if (thisValue < 0)
// System.out.printf("Cell %d dist %f radius %f xyz %f %f %f value %f thisValue %e\n",
// cellID, dist, radius,
// xyz[0], xyz[1], xyz[2],
// value, thisValue);
}
} // cell loop
} // values loop
return statMap;
}
public TreeMap<Integer, DescriptiveStatistics> getStats(
ArrayList<double[]> valuesList, double radius) {
// for each value, store indices of closest cells and distances
TreeMap<Integer, ArrayList<Pair<Long, Double>>> closestCells = new TreeMap<>();
for (int i = 0; i < valuesList.size(); i++) {
double[] values = valuesList.get(i);
Vector3D xyz = new Vector3D(values);
TreeSet<Long> sortedCellIDs =
new TreeSet<>(smallBodyModel.findClosestCellsWithinRadius(values, radius));
sortedCellIDs.add(smallBodyModel.findClosestCell(values));
ArrayList<Pair<Long, Double>> distances = new ArrayList<>();
for (long cellID : sortedCellIDs) {
TriangularFacet tf = PolyDataUtil.getFacet(polyData, cellID);
Vector3D p = MathConversions.toVector3D(tf.getCenter());
double dist = p.distance(xyz);
distances.add(Pair.create(cellID, dist));
}
closestCells.put(i, distances);
}
TreeMap<Integer, DescriptiveStatistics> statMap = new TreeMap<>();
for (int cellID = 0; cellID < polyData.GetNumberOfCells(); cellID++) {
DescriptiveStatistics stats = statMap.get(cellID);
if (stats == null) {
stats = new DescriptiveStatistics();
statMap.put(cellID, stats);
}
for (int i = 0; i < valuesList.size(); i++) {
double[] values = valuesList.get(i);
ArrayList<Pair<Long, Double>> distances = closestCells.get(i);
for (Pair<Long, Double> pair : distances) {
if (pair.getFirst().intValue() < cellID) continue;
if (pair.getFirst().intValue() > cellID) break;
if (pair.getFirst().intValue() == cellID) {
double dist = pair.getSecond();
if (dist < radius) {
double thisValue = (1 - dist / radius) * values[3];
stats.addValue(thisValue);
}
}
}
}
} // cell loop
return statMap;
}
public static void main(String[] args) {
// run the VTK garbage collector every 30 seconds
vtkObject.JAVA_OBJECT_MANAGER.getAutoGarbageCollector().SetScheduleTime(30, TimeUnit.SECONDS);
vtkObject.JAVA_OBJECT_MANAGER.getAutoGarbageCollector().SetAutoGarbageCollection(true);
// vtkObject.JAVA_OBJECT_MANAGER.getAutoGarbageCollector().SetDebug(true);
try {
ColorSpotsMain(args);
} catch (Exception e) {
logger.error(e.getLocalizedMessage(), e);
}
vtkObject.JAVA_OBJECT_MANAGER.getAutoGarbageCollector().SetAutoGarbageCollection(false);
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("additionalFields")
.hasArg()
.desc(
"Specify additional fields to write out. Allowed values are min, max, median, n, rms, sum, std, variance. "
+ "More than one field may be specified in a comma separated list (e.g. "
+ "-additionalFields sum,median,rms). Additional fields will be written out after the mean and std columns.")
.build());
options.addOption(
Option.builder("allFacets")
.desc(
"Report values for all facets in OBJ shape model, even if facet is not within searchRadius "
+ "of any points. Prints NaN if facet not within searchRadius. Default is to only "
+ "print facets which have contributions from input points.")
.build());
options.addOption(
Option.builder("info")
.required()
.hasArg()
.desc(
"Required. Name of CSV file containing value to plot."
+ " Default format is lon, lat, radius, value. See -xyz and -llOnly options for alternate formats.")
.build());
options.addOption(
Option.builder("llOnly").desc("Format of -info file is lon, lat, value.").build());
options.addOption(
Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(
Option.builder("logLevel")
.hasArg()
.desc(
"If present, print messages above selected priority. Valid values are "
+ sb.toString().trim()
+ ". Default is INFO.")
.build());
options.addOption(
Option.builder("normalize")
.desc(
"Report values per unit area (divide by total area of facets within search ellipse).")
.build());
options.addOption(
Option.builder("noWeight")
.desc("Do not weight points by distance from facet/vertex.")
.build());
options.addOption(
Option.builder("obj")
.required()
.hasArg()
.desc("Required. Name of shape model to read.")
.build());
options.addOption(
Option.builder("outFile")
.hasArg()
.desc("Specify output file to store the output.")
.build());
options.addOption(
Option.builder("searchRadius")
.hasArg()
.desc(
"Each facet will be colored using a weighted average of all points within searchRadius of the facet/vertex. "
+ "If not present, set to sqrt(2)/2 * mean facet edge length.")
.build());
options.addOption(
Option.builder("writeVertices")
.desc(
"Convert output from a per facet to per vertex format. Each line will be of the form"
+ " x, y, z, value, sigma where x, y, z are the vector components of vertex V. "
+ " Default is to only report facetID, facet_value, facet_sigma.")
.build());
options.addOption(
Option.builder("xyz").desc("Format of -info file is x, y, z, value.").build());
return options;
}
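Among the -additionalFields aggregates, rms is computed as sqrt(sumsq/n), which differs from std (the sample standard deviation). A small sketch of the two aggregates without Commons Math (`StatsDemo` is an illustrative name, not part of the tool):

```java
public class StatsDemo {
    // Root mean square: sqrt(sum of squares / n), matching the FIELD.RMS output.
    static double rms(double[] v) {
        double sumsq = 0;
        for (double x : v) sumsq += x * x;
        return Math.sqrt(sumsq / v.length);
    }

    // Sample variance (n - 1 denominator), matching DescriptiveStatistics.getVariance().
    static double sampleVariance(double[] v) {
        double sum = 0;
        for (double x : v) sum += x;
        double mean = sum / v.length;
        double ss = 0;
        for (double x : v) ss += (x - mean) * (x - mean);
        return ss / (v.length - 1);
    }

    public static void main(String[] args) {
        double[] values = {1.0, 2.0, 3.0};
        System.out.println("rms=" + rms(values) + " variance=" + sampleVariance(values));
    }
}
```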
public static void ColorSpotsMain(String[] args) throws Exception {
TerrasaurTool defaultOBJ = new ColorSpots();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
NativeLibraryLoader.loadVtkLibraries();
final boolean writeVerts = cl.hasOption("writeVertices");
final boolean allFacets = cl.hasOption("allFacets");
final boolean normalize = cl.hasOption("normalize") && !writeVerts;
final boolean weight = !cl.hasOption("noWeight");
FORMAT format = FORMAT.LLR;
for (Option option : cl.getOptions()) {
if (option.getOpt().equals("xyz")) {
format = FORMAT.XYZ;
}
if (option.getOpt().equals("llOnly")) {
format = FORMAT.LL;
}
}
vtkPolyData polyData = PolyDataUtil.loadShapeModelAndComputeNormals(cl.getOptionValue("obj"));
double radius;
if (cl.hasOption("searchRadius")) {
radius = Double.parseDouble(cl.getOptionValue("searchRadius"));
} else {
PolyDataStatistics stats = new PolyDataStatistics(polyData);
radius = stats.getMeanEdgeLength() * Math.sqrt(2) / 2;
logger.info("Using search radius of " + radius);
}
ColorSpots cs = new ColorSpots(polyData);
ArrayList<double[]> infoValues = cs.readCSV(cl.getOptionValue("info"), format);
TreeMap<Long, DescriptiveStatistics> statMap =
cs.getStatsFast(infoValues, radius, weight, writeVerts);
double totalArea = 0;
if (normalize) {
for (int facet = 0; facet < polyData.GetNumberOfCells(); facet++) {
vtkCell cell = polyData.GetCell(facet);
vtkPoints points = cell.GetPoints();
double[] pt0 = points.GetPoint(0);
double[] pt1 = points.GetPoint(1);
double[] pt2 = points.GetPoint(2);
TriangularFacet tf =
new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
double area = tf.getArea();
totalArea += area;
points.Delete();
cell.Delete();
}
}
ArrayList<FIELD> fields = new ArrayList<>();
if (cl.hasOption("additionalFields")) {
for (String s : cl.getOptionValue("additionalFields").trim().toUpperCase().split(",")) {
for (FIELD f : FIELD.values()) {
if (f.name().equalsIgnoreCase(s)) fields.add(f);
}
}
}
TreeMap<Long, ArrayList<Double>> map = new TreeMap<>();
long numPoints = (writeVerts ? polyData.GetNumberOfPoints() : polyData.GetNumberOfCells());
for (long index = 0; index < numPoints; index++) {
DescriptiveStatistics stats = statMap.get(index);
ArrayList<Double> values = new ArrayList<>();
if (stats != null) {
values.add(stats.getMean());
values.add(stats.getStandardDeviation());
for (FIELD f : fields) {
if (f == FIELD.MIN) values.add(stats.getMin());
if (f == FIELD.MAX) values.add(stats.getMax());
if (f == FIELD.MEDIAN) values.add(stats.getPercentile(50));
if (f == FIELD.N) values.add((double) stats.getN());
if (f == FIELD.RMS) values.add(Math.sqrt(stats.getSumsq() / stats.getN()));
if (f == FIELD.STD) values.add(stats.getStandardDeviation());
if (f == FIELD.SUM) values.add(stats.getSum());
if (f == FIELD.VARIANCE) values.add(stats.getVariance());
}
} else {
values.add(Double.NaN);
values.add(Double.NaN);
for (FIELD f : fields) {
if (f == FIELD.MIN) values.add(Double.NaN);
if (f == FIELD.MAX) values.add(Double.NaN);
if (f == FIELD.MEDIAN) values.add(Double.NaN);
if (f == FIELD.N) values.add(Double.NaN);
if (f == FIELD.RMS) values.add(Double.NaN);
if (f == FIELD.STD) values.add(Double.NaN);
if (f == FIELD.SUM) values.add(Double.NaN);
if (f == FIELD.VARIANCE) values.add(Double.NaN);
}
}
map.put(index, values);
}
ArrayList<String> returnList;
if (writeVerts) {
returnList = writeVertices(map, polyData, allFacets);
} else {
returnList = writeFacets(map, allFacets, normalize, totalArea);
}
if (cl.hasOption("outFile")) {
try (PrintWriter pw = new PrintWriter(cl.getOptionValue("outFile"))) {
for (String s : returnList) pw.println(s);
}
} else {
for (String string : returnList) System.out.println(string);
}
}
+ " Default is to only report facetID, facet_value, facet_sigma.")
.build());
options.addOption(Option.builder("xyz")
.desc("Format of -info file is x, y, z, value.")
.build());
return options;
}
}
private static ArrayList<String> writeFacets(
TreeMap<Long, ArrayList<Double>> map,
boolean allFacets,
boolean normalize,
double totalArea) {
public static void ColorSpotsMain(String[] args) throws Exception {
ArrayList<String> returnList = new ArrayList<>();
TerrasaurTool defaultOBJ = new ColorSpots();
for (Long facet : map.keySet()) {
ArrayList<Double> values = map.get(facet);
Double value = values.get(0);
Double sigma = values.get(1);
if (allFacets || !value.isNaN()) {
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
NativeLibraryLoader.loadVtkLibraries();
final boolean writeVerts = cl.hasOption("writeVertices");
final boolean allFacets = cl.hasOption("allFacets");
final boolean normalize = cl.hasOption("normalize") && !writeVerts;
final boolean weight = !cl.hasOption("noWeight");
FORMAT format = FORMAT.LLR;
for (Option option : cl.getOptions()) {
if (option.getOpt().equals("xyz")) {
format = FORMAT.XYZ;
}
if (option.getOpt().equals("llOnly")) {
format = FORMAT.LL;
}
}
vtkPolyData polyData = PolyDataUtil.loadShapeModelAndComputeNormals(cl.getOptionValue("obj"));
double radius;
if (cl.hasOption("searchRadius")) {
radius = Double.parseDouble(cl.getOptionValue("searchRadius"));
} else {
PolyDataStatistics stats = new PolyDataStatistics(polyData);
radius = stats.getMeanEdgeLength() * Math.sqrt(2) / 2;
logger.info("Using search radius of " + radius);
}
ColorSpots cs = new ColorSpots(polyData);
ArrayList<double[]> infoValues = cs.readCSV(cl.getOptionValue("info"), format);
TreeMap<Long, DescriptiveStatistics> statMap = cs.getStatsFast(infoValues, radius, weight, writeVerts);
double totalArea = 0;
if (normalize) {
value /= totalArea;
sigma /= totalArea;
for (int facet = 0; facet < polyData.GetNumberOfCells(); facet++) {
vtkCell cell = polyData.GetCell(facet);
vtkPoints points = cell.GetPoints();
double[] pt0 = points.GetPoint(0);
double[] pt1 = points.GetPoint(1);
double[] pt2 = points.GetPoint(2);
TriangularFacet tf = new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
double area = tf.getArea();
totalArea += area;
points.Delete();
cell.Delete();
}
}
StringBuilder sb = new StringBuilder(String.format("%d, %e, %e", facet, value, sigma));
for (int i = 2; i < values.size(); i++) {
value = values.get(i);
if (normalize) value /= totalArea;
sb.append(String.format(", %e", value));
ArrayList<FIELD> fields = new ArrayList<>();
if (cl.hasOption("additionalFields")) {
for (String s :
cl.getOptionValue("additionalFields").trim().toUpperCase().split(",")) {
for (FIELD f : FIELD.values()) {
if (f.name().equalsIgnoreCase(s)) fields.add(f);
}
}
}
TreeMap<Long, ArrayList<Double>> map = new TreeMap<>();
long numPoints = (writeVerts ? polyData.GetNumberOfPoints() : polyData.GetNumberOfCells());
for (long index = 0; index < numPoints; index++) {
DescriptiveStatistics stats = statMap.get(index);
ArrayList<Double> values = new ArrayList<>();
if (stats != null) {
values.add(stats.getMean());
values.add(stats.getStandardDeviation());
for (FIELD f : fields) {
if (f == FIELD.MIN) values.add(stats.getMin());
if (f == FIELD.MAX) values.add(stats.getMax());
if (f == FIELD.MEDIAN) values.add(stats.getPercentile(50));
if (f == FIELD.N) values.add((double) stats.getN());
if (f == FIELD.RMS) values.add(Math.sqrt(stats.getSumsq() / stats.getN()));
if (f == FIELD.STD) values.add(stats.getStandardDeviation());
if (f == FIELD.SUM) values.add(stats.getSum());
if (f == FIELD.VARIANCE) values.add(stats.getVariance());
}
} else {
values.add(Double.NaN);
values.add(Double.NaN);
for (FIELD f : fields) {
if (f == FIELD.MIN) values.add(Double.NaN);
if (f == FIELD.MAX) values.add(Double.NaN);
if (f == FIELD.MEDIAN) values.add(Double.NaN);
if (f == FIELD.N) values.add(Double.NaN);
if (f == FIELD.RMS) values.add(Double.NaN);
if (f == FIELD.STD) values.add(Double.NaN);
if (f == FIELD.SUM) values.add(Double.NaN);
if (f == FIELD.VARIANCE) values.add(Double.NaN);
}
}
map.put(index, values);
}
ArrayList<String> returnList;
if (writeVerts) {
returnList = writeVertices(map, polyData, allFacets);
} else {
returnList = writeFacets(map, allFacets, normalize, totalArea);
}
if (cl.hasOption("outFile")) {
try (PrintWriter pw = new PrintWriter(cl.getOptionValue("outFile"))) {
for (String s : returnList) pw.println(s);
}
} else {
for (String string : returnList) System.out.println(string);
}
returnList.add(sb.toString());
}
}
return returnList;
}
private static ArrayList<String> writeVertices(
TreeMap<Long, ArrayList<Double>> map, vtkPolyData polyData, boolean allFacets) {
private static ArrayList<String> writeFacets(
TreeMap<Long, ArrayList<Double>> map, boolean allFacets, boolean normalize, double totalArea) {
ArrayList<String> returnList = new ArrayList<>();
ArrayList<String> returnList = new ArrayList<>();
double[] thisPt = new double[3];
for (Long vertex : map.keySet()) {
ArrayList<Double> values = map.get(vertex);
Double value = values.get(0);
Double sigma = values.get(1);
if (allFacets || !value.isNaN()) {
// get vertex x,y,z values
polyData.GetPoint(vertex, thisPt);
StringBuilder sb =
new StringBuilder(
String.format("%e, %e, %e, %e, %e", thisPt[0], thisPt[1], thisPt[2], value, sigma));
for (int i = 2; i < values.size(); i++) {
value = values.get(i);
sb.append(String.format(", %e", value));
for (Long facet : map.keySet()) {
ArrayList<Double> values = map.get(facet);
Double value = values.get(0);
Double sigma = values.get(1);
if (allFacets || !value.isNaN()) {
if (normalize) {
value /= totalArea;
sigma /= totalArea;
}
StringBuilder sb = new StringBuilder(String.format("%d, %e, %e", facet, value, sigma));
for (int i = 2; i < values.size(); i++) {
value = values.get(i);
if (normalize) value /= totalArea;
sb.append(String.format(", %e", value));
}
returnList.add(sb.toString());
}
}
returnList.add(sb.toString());
}
return returnList;
}
private static ArrayList<String> writeVertices(
TreeMap<Long, ArrayList<Double>> map, vtkPolyData polyData, boolean allFacets) {
ArrayList<String> returnList = new ArrayList<>();
double[] thisPt = new double[3];
for (Long vertex : map.keySet()) {
ArrayList<Double> values = map.get(vertex);
Double value = values.get(0);
Double sigma = values.get(1);
if (allFacets || !value.isNaN()) {
// get vertex x,y,z values
polyData.GetPoint(vertex, thisPt);
StringBuilder sb = new StringBuilder(
String.format("%e, %e, %e, %e, %e", thisPt[0], thisPt[1], thisPt[2], value, sigma));
for (int i = 2; i < values.size(); i++) {
value = values.get(i);
sb.append(String.format(", %e", value));
}
returnList.add(sb.toString());
}
}
return returnList;
}
return returnList;
}
}

File diff suppressed because it is too large

View File

@@ -45,195 +45,223 @@ import vtk.vtkPolyData;
public class CreateSBMTStructure implements TerrasaurTool {

  private static final Logger logger = LogManager.getLogger();

  /**
   * This doesn't need to be private, or even declared, but you might want to if you have other
   * constructors.
   */
  private CreateSBMTStructure() {}

  @Override
  public String shortDescription() {
    return "Construct ellipses from user-defined points on an image.";
  }

  @Override
  public String fullDescription(Options options) {
    String header = "This tool creates an SBMT ellipse file from a set of points on an image.";
    String footer = "";
    return TerrasaurTool.super.fullDescription(options, header, footer);
  }

  /**
   * Create an Ellipse as an SBMT structure from three points. The first two points define the long
   * axis and the third point defines the short axis.
   *
   * @param id Record id
   * @param name Record name
   * @param p1 First point
   * @param p2 Second point
   * @param p3 Third point
   * @return An SBMT structure describing the ellipse
   */
  private static SBMTEllipseRecord createRecord(int id, String name, Vector3D p1, Vector3D p2, Vector3D p3) {
    // Create a local coordinate system where X axis contains long axis and Y axis contains short
    // axis
    Vector3D origin = p1.add(p2).scalarMultiply(0.5);
    Vector3D X = p1.subtract(p2).normalize();
    Vector3D Y = p3.subtract(origin).normalize();

    // Create a rotation matrix to go from body fixed frame to this local coordinate system
    Rotation globalToLocal = RotationUtils.IprimaryJsecondary(X, Y);

    // All of these vectors should have a Z coordinate of zero
    Vector3D p1Local = globalToLocal.applyTo(p1);
    Vector3D p2Local = globalToLocal.applyTo(p2);
    Vector3D p3Local = globalToLocal.applyTo(p3);

    // fit an ellipse to the three points on the plane
    Vector2D a = new Vector2D(p1Local.getX(), p1Local.getY());
    Vector2D b = new Vector2D(p2Local.getX(), p2Local.getY());
    Vector2D c = new Vector2D(p3Local.getX(), p3Local.getY());

    Vector2D center = a.add(b).scalarMultiply(0.5);
    double majorAxis = a.subtract(b).getNorm();
    double minorAxis = 2 * c.subtract(center).getNorm();

    double rotation = Math.atan2(b.getY() - a.getY(), b.getX() - a.getX());
    double flattening = (majorAxis - minorAxis) / majorAxis;

    ImmutableSBMTEllipseRecord.Builder record = ImmutableSBMTEllipseRecord.builder()
        .id(id)
        .name(name)
        .x(origin.getX())
        .y(origin.getY())
        .z(origin.getZ())
        .lat(origin.getDelta())
        .lon(origin.getAlpha())
        .radius(origin.getNorm())
        .slope(0)
        .elevation(0)
        .acceleration(0)
        .potential(0)
        .diameter(majorAxis)
        .flattening(flattening)
        .angle(rotation)
        .color(Color.BLACK)
        .dummy("")
        .label("");
    return record.build();
  }
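Once the three points are rotated into the ellipse plane, the fit in createRecord reduces to a few lines of 2D geometry: the center is the midpoint of the long axis, the minor axis is twice the distance from the center to the third point, and the flattening follows from the two axis lengths. A self-contained sketch of that planar math (class name and sample points invented for illustration):

```java
public class EllipseFromPointsDemo {
    /**
     * Returns {majorAxis, minorAxis, rotation, flattening} given the endpoints
     * a, b of the long axis and a point c on the short axis, all in the plane.
     */
    static double[] ellipseParams(double[] a, double[] b, double[] c) {
        // center is the midpoint of the long axis
        double cx = (a[0] + b[0]) / 2, cy = (a[1] + b[1]) / 2;
        double major = Math.hypot(a[0] - b[0], a[1] - b[1]);
        double minor = 2 * Math.hypot(c[0] - cx, c[1] - cy);
        double rotation = Math.atan2(b[1] - a[1], b[0] - a[0]);
        double flattening = (major - minor) / major;
        return new double[] {major, minor, rotation, flattening};
    }

    public static void main(String[] args) {
        // an axis-aligned ellipse with semi-axes 2 and 1
        double[] p = ellipseParams(new double[] {-2, 0}, new double[] {2, 0}, new double[] {0, 1});
        System.out.printf("major=%.1f minor=%.1f rotation=%.1f flattening=%.2f%n", p[0], p[1], p[2], p[3]);
    }
}
```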
  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
    options.addOption(Option.builder("flipX")
        .desc("If present, negate the X coordinate of the input points.")
        .build());
    options.addOption(Option.builder("flipY")
        .desc("If present, negate the Y coordinate of the input points.")
        .build());
    options.addOption(Option.builder("input")
        .required()
        .hasArg()
        .desc(
            """
            Required. Name of input file. This is a text file with a pair of pixel coordinates (X and Y) per line. The pixel
            coordinates are offsets from the image center. For example:

            # My test file
            89.6628 285.01
            97.8027 280.126
            95.0119 285.01
            -13.8299 323.616
            -1.9689 331.756
            -11.7367 330.826

            Empty lines or lines beginning with # are ignored.
            Each set of three points is used to create an SBMT structure. The first two points define the long
            axis and the third is a location on the semi-minor axis.""")
        .build());
    options.addOption(Option.builder("objFile")
        .required()
        .hasArg()
        .desc("Required. Name of OBJ shape file.")
        .build());
    options.addOption(Option.builder("output")
        .required()
        .hasArg()
        .desc("Required. Name of output file.")
        .build());
    options.addOption(Option.builder("spice")
        .hasArg()
        .desc(
            "If present, name of metakernel to read. Other required options with -spice are -date, -observer, -target, and -cameraFrame.")
        .build());
    options.addOption(Option.builder("date")
        .hasArgs()
        .desc("Only used with -spice. Date of image (e.g. 2022 SEP 26 23:11:12.649).")
        .build());
    options.addOption(Option.builder("observer")
        .hasArg()
        .desc("Only used with -spice. Observing body (e.g. DART).")
        .build());
    options.addOption(Option.builder("target")
        .hasArg()
        .desc("Only used with -spice. Target body (e.g. DIMORPHOS).")
        .build());
    options.addOption(Option.builder("cameraFrame")
        .hasArg()
        .desc("Only used with -spice. Camera frame (e.g. DART_DRACO).")
        .build());
    options.addOption(Option.builder("sumFile")
        .required()
        .hasArg()
        .desc(
            "Required. Name of sum file to read. This is still required with -spice, but only used as a template to create a new sum file.")
        .build());
    return options;
  }
  public static void main(String[] args) {
    TerrasaurTool defaultOBJ = new CreateSBMTStructure();

    Options options = defineOptions();
    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet()) logger.info("{} {}", ml.label, startupMessages.get(ml));

    NativeLibraryLoader.loadSpiceLibraries();
    NativeLibraryLoader.loadVtkLibraries();

    SumFile sumFile = SumFile.fromFile(new File(cl.getOptionValue("sumFile")));

    try {
      String objFile = cl.getOptionValue("objFile");
      vtkPolyData polyData = PolyDataUtil.loadShapeModel(objFile);
      if (polyData == null) {
        logger.error("Cannot read shape model {}!", objFile);
        System.exit(0);
      }

      RangeFromSumFile rfsf = new RangeFromSumFile(sumFile, polyData);

      boolean flipX = cl.hasOption("flipX");
      boolean flipY = cl.hasOption("flipY");

      List<Vector3D> intercepts = new ArrayList<>();
      List<String> lines = FileUtils.readLines(new File(cl.getOptionValue("input")), Charset.defaultCharset());
      for (String line : lines.stream()
          .filter(s -> !(s.isBlank() || s.strip().startsWith("#")))
          .toList()) {
        String[] parts = line.split("\\s+");
        int ix = (int) Math.round(Double.parseDouble(parts[0]));
        int iy = (int) Math.round(Double.parseDouble(parts[1]));
        if (flipX) ix *= -1;
        if (flipY) iy *= -1;
        Map.Entry<Long, Vector3D> entry = rfsf.findIntercept(ix, iy);
        long cellID = entry.getKey();
        if (cellID > -1) intercepts.add(entry.getValue());
      }

      logger.info("Found {} sets of points", intercepts.size() / 3);

      List<SBMTEllipseRecord> records = new ArrayList<>();
      for (int i = 0; i < intercepts.size(); i += 3) {
        // p1 and p2 define the long axis of the ellipse
        Vector3D p1 = intercepts.get(i);
        Vector3D p2 = intercepts.get(i + 1);
        // p3 lies on the short axis
        Vector3D p3 = intercepts.get(i + 2);
        SBMTEllipseRecord record = createRecord(i / 3, String.format("Ellipse %d", i / 3), p1, p2, p3);
        records.add(record);
      }

      try (PrintWriter pw = new PrintWriter(cl.getOptionValue("output"))) {
        for (SBMTEllipseRecord record : records) pw.println(record.toString());
      }
      logger.info("Wrote {}", cl.getOptionValue("output"));

    } catch (Exception e) {
      logger.error(e.getLocalizedMessage(), e);
      throw new RuntimeException(e);
    }
    logger.info("Finished");
  }
}

View File

@@ -43,176 +43,168 @@ import terrasaur.templates.TerrasaurTool;
public class DSK2OBJ implements TerrasaurTool {

  private static final Logger logger = LogManager.getLogger();

  @Override
  public String shortDescription() {
    return "Create an OBJ from a DSK.";
  }

  @Override
  public String fullDescription(Options options) {
    String header = "";
    String footer = "\nCreate an OBJ from a DSK.\n";
    return TerrasaurTool.super.fullDescription(options, header, footer);
  }

  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
    options.addOption(Option.builder("body")
        .hasArg()
        .desc("If present, convert shape for named body. Default is to use the first body in the DSK.")
        .build());
    options.addOption(Option.builder("dsk")
        .hasArg()
        .required()
        .desc("Required. Name of input DSK.")
        .build());
    options.addOption(Option.builder("logFile")
        .hasArg()
        .desc("If present, save screen output to log file.")
        .build());
    StringBuilder sb = new StringBuilder();
    for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
    options.addOption(Option.builder("logLevel")
        .hasArg()
        .desc("If present, print messages above selected priority. Valid values are "
            + sb.toString().trim()
            + ". Default is INFO.")
        .build());
    options.addOption(
        Option.builder("obj").hasArg().desc("Name of output OBJ.").build());
    options.addOption(Option.builder("printBodies")
        .desc("If present, print bodies and surface ids in DSK.")
        .build());
    options.addOption(Option.builder("surface")
        .hasArg()
        .desc("If present, use specified surface id. Default is to use the first surface id for the body.")
        .build());
    return options;
  }

  public static void main(String[] args) {
    TerrasaurTool defaultOBJ = new DSK2OBJ();

    Options options = defineOptions();
    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet())
      logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));

    System.loadLibrary("JNISpice");

    String dskName = cl.getOptionValue("dsk");
    File dskFile = new File(dskName);
    if (!dskFile.exists()) {
      logger.warn("Input DSK " + dskName + " does not exist!");
      System.exit(0);
    }

    try {
      DSK dsk = DSK.openForRead(dskName);
      Body[] bodies = dsk.getBodies();

      if (cl.hasOption("printBodies")) {
        logger.info("found bodies and surface ids:");
        for (int i = 0; i < bodies.length; i++) {
          Body b = bodies[i];
          Surface[] surfaces = dsk.getSurfaces(b);
          StringBuilder sb = new StringBuilder();
          sb.append(String.format("%d) %s", i, b.getName()));
          for (Surface s : surfaces) sb.append(String.format(" %d", s.getIDCode()));
          logger.info(sb.toString());
        }
      }

      Body b = cl.hasOption("body") ? new Body(cl.getOptionValue("body")) : bodies[0];
      boolean missingBody = true;
      for (Body body : bodies) {
        if (b.equals(body)) {
          missingBody = false;
          break;
        }
      }
      if (missingBody) {
        logger.warn(String.format("Body %s not found in DSK! Valid bodies are:", b.getName()));
        for (Body body : bodies) logger.warn(body.getName());
        System.exit(0);
      }

      Surface[] surfaces = dsk.getSurfaces(b);
      Surface s = cl.hasOption("surface")
          ? new Surface(Integer.parseInt(cl.getOptionValue("surface")), b)
          : surfaces[0];
      boolean missingSurface = true;
      for (Surface surface : surfaces) {
        if (s.equals(surface)) {
          missingSurface = false;
          break;
        }
      }
      if (missingSurface) {
        logger.warn(String.format(
            "Surface %d for body %s not found in DSK! Valid surfaces are:", s.getIDCode(), b.getName()));
        for (Surface surface : surfaces) logger.warn(Integer.toString(surface.getIDCode()));
        System.exit(0);
      }

      DLADescriptor dladsc = dsk.beginBackwardSearch();
      boolean found = true;
      while (found) {
        DSKDescriptor dskdsc = dsk.getDSKDescriptor(dladsc);
        if (b.getIDCode() == dskdsc.getCenterID() && s.getIDCode() == dskdsc.getSurfaceID()) {
          // number of plates and vertices
          int[] np = new int[1];
          int[] nv = new int[1];
          CSPICE.dskz02(dsk.getHandle(), dladsc.toArray(), nv, np);
          double[][] vertices = CSPICE.dskv02(dsk.getHandle(), dladsc.toArray(), 1, nv[0]);
          int[][] plates = CSPICE.dskp02(dsk.getHandle(), dladsc.toArray(), 1, np[0]);
          if (cl.hasOption("obj")) {
            try (PrintWriter pw = new PrintWriter(cl.getOptionValue("obj"))) {
              for (double[] v : vertices) {
                pw.printf("v %20.16f %20.16f %20.16f\r\n", v[0], v[1], v[2]);
              }
              for (int[] p : plates) {
                pw.printf("f %d %d %d\r\n", p[0], p[1], p[2]);
              }
              logger.info(String.format(
                  "Wrote %d vertices and %d plates to %s for body %d surface %d",
                  nv[0],
                  np[0],
                  cl.getOptionValue("obj"),
                  dskdsc.getCenterID(),
                  dskdsc.getSurfaceID()));
            } catch (FileNotFoundException e) {
              logger.warn(e.getLocalizedMessage());
            }
          }
        }
        found = dsk.hasPrevious(dladsc);
        if (found) {
          dladsc = dsk.getPrevious(dladsc);
        }
      }
    } catch (SpiceException e) {
      logger.warn(e.getLocalizedMessage());
    }
  }
}
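The printf calls above emit the two record types of a plain Wavefront OBJ file: `v` lines carrying vertex coordinates and `f` lines carrying 1-based vertex indices for each triangular plate. A minimal sketch producing the same layout for a tetrahedron (class name invented; `%n` used here instead of the tool's literal `\r\n` line endings):

```java
import java.io.PrintWriter;
import java.io.StringWriter;

public class ObjWriterDemo {
    /** Render vertices and 1-based triangular faces in Wavefront OBJ layout. */
    static String toObj(double[][] verts, int[][] faces) {
        StringWriter sw = new StringWriter();
        try (PrintWriter pw = new PrintWriter(sw)) {
            // one "v x y z" line per vertex
            for (double[] v : verts) pw.printf("v %20.16f %20.16f %20.16f%n", v[0], v[1], v[2]);
            // one "f i j k" line per triangle; indices start at 1, not 0
            for (int[] f : faces) pw.printf("f %d %d %d%n", f[0], f[1], f[2]);
        }
        return sw.toString();
    }

    public static void main(String[] args) {
        double[][] verts = {{0, 0, 0}, {1, 0, 0}, {0, 1, 0}, {0, 0, 1}};
        int[][] faces = {{1, 3, 2}, {1, 2, 4}, {1, 4, 3}, {2, 3, 4}};
        System.out.print(toObj(verts, faces));
    }
}
```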

File diff suppressed because it is too large

View File

@@ -43,56 +43,54 @@ import terrasaur.utils.AppVersion;
public class DumpConfig implements TerrasaurTool {

  @Override
  public String shortDescription() {
    return "Write a sample configuration file to use with Terrasaur, using defaults for DART.";
  }

  @Override
  public String fullDescription(Options options) {
    return WordUtils.wrap(
        "This program writes out sample configuration files to be used with Terrasaur. "
            + "It takes a single argument, which is the name of the directory that will contain "
            + "the configuration files to be written.",
        80);
  }

  public static void main(String[] args) {
    // if no arguments, print the usage and exit
    if (args.length == 0) {
      System.out.println(new DumpConfig().fullDescription(null));
      System.exit(0);
    }

    // if -shortDescription is specified, print short description and exit.
    for (String arg : args) {
      if (arg.equals("-shortDescription")) {
        System.out.println(new DumpConfig().shortDescription());
        System.exit(0);
      }
    }

    File path = Paths.get(args[0]).toFile();

    ConfigBlock configBlock = TerrasaurConfig.getTemplate();
    try (PrintWriter pw = new PrintWriter(path)) {
      PropertiesConfiguration config = new ConfigBlockFactory().toConfig(configBlock);
      PropertiesConfigurationLayout layout = config.getLayout();

      String now = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
          .withLocale(Locale.getDefault())
          .withZone(ZoneOffset.UTC)
          .format(Instant.now());
      layout.setHeaderComment(
          String.format("Configuration file for %s\nCreated %s UTC", AppVersion.getVersionString(), now));

      config.write(pw);
    } catch (ConfigurationException | IOException e) {
      throw new RuntimeException(e);
    }
    System.out.println("Wrote config file to " + path.getAbsolutePath());
  }
}


@@ -0,0 +1,261 @@
/*
* The MIT License
* Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in
* all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
* THE SOFTWARE.
*/
package terrasaur.apps;
import java.io.PrintWriter;
import java.util.Arrays;
import java.util.Map;
import java.util.NavigableSet;
import java.util.TreeSet;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
import org.apache.commons.math3.geometry.euclidean.threed.SphericalCoordinates;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.immutables.value.Value;
import terrasaur.templates.TerrasaurTool;
import terrasaur.utils.CellInfo;
import terrasaur.utils.NativeLibraryLoader;
import terrasaur.utils.PolyDataStatistics;
import terrasaur.utils.PolyDataUtil;
import vtk.vtkIdList;
import vtk.vtkOBBTree;
import vtk.vtkPolyData;
public class FacetInfo implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
/**
* This doesn't need to be private, or even declared, but you might want to if you have other
* constructors.
*/
private FacetInfo() {}
@Override
public String shortDescription() {
return "Print info about a facet.";
}
@Override
public String fullDescription(Options options) {
String header = "Prints information about facet(s).";
String footer =
"""
This tool prints out facet center, normal, angle between center and
normal, and other information about the specified facet(s).""";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
private vtkPolyData polyData;
private vtkOBBTree searchTree;
private Vector3D origin;
private FacetInfo(vtkPolyData polyData) {
this.polyData = polyData;
PolyDataStatistics stats = new PolyDataStatistics(polyData);
origin = new Vector3D(stats.getCentroid());
logger.info("Origin is at {}", origin);
logger.info("Creating search tree");
searchTree = new vtkOBBTree();
searchTree.SetDataSet(polyData);
searchTree.SetTolerance(1e-12);
searchTree.BuildLocator();
}
/**
* @param cellId id of this cell
* @return Set of neighboring cells (ones which share a vertex with this one)
*/
private NavigableSet<Long> neighbors(long cellId) {
NavigableSet<Long> neighborCellIds = new TreeSet<>();
vtkIdList vertexIdlist = new vtkIdList();
CellInfo.getCellInfo(polyData, cellId, vertexIdlist);
vtkIdList facetIdlist = new vtkIdList();
    for (long i = 0; i < vertexIdlist.GetNumberOfIds(); i++) {
      long vertexId = vertexIdlist.GetId(i);
      polyData.GetPointCells(vertexId, facetIdlist);
      // GetPointCells overwrites facetIdlist on each call, so collect the ids
      // for this vertex before moving on to the next one
      for (long j = 0; j < facetIdlist.GetNumberOfIds(); j++) {
        long id = facetIdlist.GetId(j);
        if (id == cellId) continue;
        neighborCellIds.add(id);
      }
    }
return neighborCellIds;
}
@Value.Immutable
public abstract static class FacetInfoLine {
public abstract long index();
public abstract Vector3D radius();
public abstract Vector3D normal();
/**
* @return facets between this and origin
*/
public abstract NavigableSet<Long> interiorIntersections();
/**
* @return facets between this and infinity
*/
public abstract NavigableSet<Long> exteriorIntersections();
public static String getHeader() {
return "# Index, "
+ "Center Lat (deg), "
+ "Center Lon (deg), "
+ "Radius, "
+ "Radial Vector, "
+ "Normal Vector, "
+ "Angle between radial and normal (deg), "
+ "facets between this and origin, "
+ "facets between this and infinity";
}
public String toCSV() {
SphericalCoordinates spc = new SphericalCoordinates(radius());
StringBuilder sb = new StringBuilder();
sb.append(String.format("%d, ", index()));
sb.append(String.format("%.4f, ", 90 - Math.toDegrees(spc.getPhi())));
sb.append(String.format("%.4f, ", Math.toDegrees(spc.getTheta())));
sb.append(String.format("%.6f, ", spc.getR()));
sb.append(String.format("%.6f %.6f %.6f, ", radius().getX(), radius().getY(), radius().getZ()));
sb.append(String.format("%.6f %.6f %.6f, ", normal().getX(), normal().getY(), normal().getZ()));
sb.append(String.format("%.3f, ", Math.toDegrees(Vector3D.angle(radius(), normal()))));
sb.append(String.format("%d", interiorIntersections().size()));
if (!interiorIntersections().isEmpty()) {
for (long id : interiorIntersections()) sb.append(String.format(" %d", id));
}
sb.append(", ");
sb.append(String.format("%d", exteriorIntersections().size()));
if (!exteriorIntersections().isEmpty()) {
for (long id : exteriorIntersections()) sb.append(String.format(" %d", id));
}
return sb.toString();
}
}
private FacetInfoLine getFacetInfoLine(long cellId) {
CellInfo ci = CellInfo.getCellInfo(polyData, cellId, new vtkIdList());
vtkIdList cellIds = new vtkIdList();
searchTree.IntersectWithLine(origin.toArray(), ci.center().toArray(), null, cellIds);
// count up all crossings of the surface between the origin and the facet.
NavigableSet<Long> insideIds = new TreeSet<>();
for (long j = 0; j < cellIds.GetNumberOfIds(); j++) {
if (cellIds.GetId(j) == cellId) continue;
insideIds.add(cellIds.GetId(j));
}
Vector3D infinity = ci.center().scalarMultiply(1e9);
cellIds = new vtkIdList();
searchTree.IntersectWithLine(infinity.toArray(), ci.center().toArray(), null, cellIds);
// count up all crossings of the surface between the infinity and the facet.
NavigableSet<Long> outsideIds = new TreeSet<>();
for (long j = 0; j < cellIds.GetNumberOfIds(); j++) {
if (cellIds.GetId(j) == cellId) continue;
outsideIds.add(cellIds.GetId(j));
}
return ImmutableFacetInfoLine.builder()
.index(cellId)
.radius(ci.center())
.normal(ci.normal())
.interiorIntersections(insideIds)
.exteriorIntersections(outsideIds)
.build();
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("facet")
.required()
.hasArgs()
.desc("Facet(s) to query. Separate multiple indices with whitespace.")
.build());
options.addOption(Option.builder("obj")
.required()
.hasArg()
.desc("Shape model to validate.")
.build());
options.addOption(Option.builder("output")
.required()
.hasArg()
.desc("CSV file to write.")
.build());
return options;
}
public static void main(String[] args) {
TerrasaurTool defaultOBJ = new FacetInfo();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet()) logger.info("{} {}", ml.label, startupMessages.get(ml));
NativeLibraryLoader.loadVtkLibraries();
try {
vtkPolyData polydata = PolyDataUtil.loadShapeModel(cl.getOptionValue("obj"));
FacetInfo app = new FacetInfo(polydata);
try (PrintWriter pw = new PrintWriter(cl.getOptionValue("output"))) {
pw.println(FacetInfoLine.getHeader());
for (long cellId : Arrays.stream(cl.getOptionValues("facet"))
.mapToLong(Long::parseLong)
.toArray()) {
pw.println(app.getFacetInfoLine(cellId).toCSV());
NavigableSet<Long> neighbors = app.neighbors(cellId);
for (long neighborCellId : neighbors)
pw.println(app.getFacetInfoLine(neighborCellId).toCSV());
}
}
logger.info("Wrote {}", cl.getOptionValue("output"));
} catch (Exception e) {
logger.error(e.getLocalizedMessage(), e);
}
logger.info("Finished.");
}
}


@@ -178,24 +178,24 @@ import vtk.vtkPolyData;
 */
public class GetSpots implements TerrasaurTool {

  private static final Logger logger = LogManager.getLogger();

  private GetSpots() {}

  @Override
  public String shortDescription() {
    return "find relevant OSIRIS-REx data for assigning values to facets in an OBJ file.";
  }

  @Override
  public String fullDescription(Options options) {
    String header =
        """
This program identifies those times when the boresight of instrumenttype intersects the surface
of Bennu less than a specified distance from the center of individual facets in shape model.
""";
    String footer =
        """
All output is written to standard output in the following format:
F1
sclkvalue dist pos frac flag inc ems phs [otherdata]
@@ -225,424 +225,403 @@ public class GetSpots implements TerrasaurTool {
phs is the phase angle in degrees
[otherdata] is all textual data on the line following the sclk value, up to the linefeed, unchanged.
""";
    return TerrasaurTool.super.fullDescription(options, header, footer);
  }

  private String spicemetakernel;
  private String objfile;
  private String instrument;
  private String sclkfile;
  private int debugLevel;
  private double maxdist;

  private vtkPolyData polydata;
  private SmallBodyModel smallBodyModel;
  private HashMap<Integer, String> coverageMap;

  private int SC_ID;
  private String SC_ID_String;
  private ReferenceFrame BodyFixed;
  private String TARGET;
  private final Vector3 NORTH = new Vector3(0, 0, 1e6);
  private int instID;
  private PrintStream outputStream;

  public GetSpots(
      String spicemetakernel,
      String objfile,
      String instrument,
      String sclkfile,
      double maxdist,
      int debugLevel) {
    this.spicemetakernel = spicemetakernel;
    this.objfile = objfile;
    this.instrument = instrument;
    this.sclkfile = sclkfile;
    this.maxdist = maxdist;
    this.debugLevel = debugLevel;
    coverageMap = new HashMap<>();
    outputStream = System.out;
  }

  /**
   * Find facets covered by the FOV of the instrument. For each facet, find the distance and
   * position angle of the instrument boresight and the fraction of the facet covered by the FOV.
   *
   * @param line String where the first "word" is an sclk time
   * @throws SpiceException
   */
  private void findCoverage(String line) throws SpiceException {
    String[] parts = line.split(" ");
    if (parts.length == 0) return;

    SCLKTime sclkTime = new SCLKTime(new SCLK(SC_ID), parts[0]);
    TDBTime tdbTime = new TDBTime(sclkTime.getTDBSeconds());

    Instrument instrument = new Instrument(instID);
    FOV fov = new FOV(instrument);
    Matrix33 instrToBodyFixed = fov.getReferenceFrame().getPositionTransformation(BodyFixed, tdbTime);
    Vector3 bsightBodyFixed = instrToBodyFixed.mxv(fov.getBoresight());

    StateRecord sr = new StateRecord(
        new Body(SC_ID_String), tdbTime, BodyFixed, new AberrationCorrection("LT+S"), new Body(TARGET));
    Vector3 scposBodyFixed = sr.getPosition();

    PositionVector sunPos = new StateRecord(
            new Body("SUN"), tdbTime, BodyFixed, new AberrationCorrection("LT+S"), new Body(TARGET))
        .getPosition();

    double[] double3 = new double[3];
    long cellID = smallBodyModel.computeRayIntersection(
        scposBodyFixed.toArray(), bsightBodyFixed.hat().toArray(), double3);
    if (cellID == -1) return; // no boresight intersection

    Vector3 bsightIntersectVector = new Vector3(double3);
    if (debugLevel > 1) {
      LatitudinalCoordinates lc = new LatitudinalCoordinates(bsightIntersectVector);
      System.out.printf(
          "# %s %f %f %s\n",
          sclkTime,
          Math.toDegrees(lc.getLatitude()),
          Math.toDegrees(lc.getLongitude()),
          bsightIntersectVector);
    }

    // flag is 1 if any portion of the spot does not intersect the surface
    int flag = 0;
    Vector<Vector3> boundaryBodyFixed = new Vector<>();
    if (fov.getShape().equals("RECTANGLE") || fov.getShape().equals("POLYGON")) {
      for (Vector3 boundary : fov.getBoundary()) {
        boundaryBodyFixed.add(instrToBodyFixed.mxv(boundary));
      }
    } else if (fov.getShape().equals("CIRCLE")) {
      // bounds contains a single vector parallel to a ray that lies in the cone
      // that makes up the boundary of the FOV
      Vector3[] bounds = fov.getBoundary();

      for (int i = 0; i < 8; i++) {
        // not ideal, but check every 45 degrees along the perimeter of the circle for intersection
        // with the surface
        Matrix33 rotateAlongPerimeter = new Matrix33(fov.getBoresight(), i * Math.toRadians(45));
        Vector3 perimeterVector = rotateAlongPerimeter.mxv(bounds[0]);
        boundaryBodyFixed.add(instrToBodyFixed.mxv(perimeterVector));
      }
    } else {
      // TODO: add ELLIPSE
      System.err.printf("Instrument %s: Unsupported FOV shape %s\n", instrument.getName(), fov.getShape());
      System.exit(0);
    }

    // check all of the boundary vectors for surface intersections
    for (Vector3 vector : boundaryBodyFixed) {
      cellID = smallBodyModel.computeRayIntersection(
          scposBodyFixed.toArray(), vector.hat().toArray(), double3);
      if (cellID == -1) {
        flag = 1;
        break;
      }
    }

    vtkIdList idList = new vtkIdList();
    for (int i = 0; i < polydata.GetNumberOfCells(); ++i) {

      polydata.GetCellPoints(i, idList);
      double[] pt0 = polydata.GetPoint(idList.GetId(0));
      double[] pt1 = polydata.GetPoint(idList.GetId(1));
      double[] pt2 = polydata.GetPoint(idList.GetId(2));

      CellInfo ci = CellInfo.getCellInfo(polydata, i, idList);
      Vector3 facetNormal = MathConversions.toVector3(ci.normal());
      Vector3 facetCenter = MathConversions.toVector3(ci.center());

      // check that facet faces the observer
      Vector3 facetToSC = scposBodyFixed.sub(facetCenter);
      double emission = facetToSC.sep(facetNormal);
      if (emission > Math.PI / 2) continue;

      double dist = findDist(scposBodyFixed, bsightIntersectVector, facetCenter) * 1e3; // milliradians
      if (dist < maxdist) {

        Vector3 facetToSun = sunPos.sub(facetCenter);
        double incidence = facetToSun.sep(facetNormal);
        double phase = facetToSun.sep(facetToSC);

        Vector3 pt0v = new Vector3(pt0);
        Vector3 pt1v = new Vector3(pt1);
        Vector3 pt2v = new Vector3(pt2);
        Vector3 span1 = pt1v.sub(pt0v);
        Vector3 span2 = pt2v.sub(pt0v);

        Plane facetPlane = new Plane(pt0v, span1, span2);
        Vector3 localNorth = facetPlane.project(NORTH).sub(facetCenter);
        Vector3 bsightIntersectProjection =
            facetPlane.project(bsightIntersectVector).sub(facetCenter);

        // 0 = North, 90 = East
        double pos = Math.toDegrees(Math.acos(localNorth.hat().dot(bsightIntersectProjection.hat())));
        if (localNorth.cross(bsightIntersectProjection).dot(facetNormal) > 0) pos = 360 - pos;

        int nCovered = 0;
        if (SPICEUtil.isInFOV(fov, instrToBodyFixed.mtxv(pt0v.sub(scposBodyFixed)))) nCovered++;
        if (SPICEUtil.isInFOV(fov, instrToBodyFixed.mtxv(pt1v.sub(scposBodyFixed)))) nCovered++;
        if (SPICEUtil.isInFOV(fov, instrToBodyFixed.mtxv(pt2v.sub(scposBodyFixed)))) nCovered++;

        double frac;
        if (nCovered == 3) {
          frac = 1;
        } else {
          final double sep012 = span1.negate().sep(pt2v.sub(pt1v)); // angle at vertex 1
          final double sep021 = span2.negate().sep(pt1v.sub(pt2v)); // angle at vertex 2

          // check 0.5*nPts^2 points if they fall in FOV
          int nPts = 50;
          Vector<Vector3> pointsInFacet = new Vector<>();
          for (int ii = 0; ii < nPts; ii++) {
            Vector3 x = pt0v.add(span1.scale(ii / (nPts - 1.)));
            for (int jj = 0; jj < nPts; jj++) {
              Vector3 y = x.add(span2.scale(jj / (nPts - 1.)));
              // if outside the facet, angle 01y will be larger than angle 012
              if (span1.negate().sep(y.sub(pt1v)) > sep012) continue;
              // if outside the facet, angle 02y will be larger than angle 021
              if (span2.negate().sep(y.sub(pt2v)) > sep021) continue;
              pointsInFacet.add(instrToBodyFixed.mtxv(y.sub(scposBodyFixed)));
            }
          }

          if (pointsInFacet.isEmpty()) {
            frac = 0;
          } else {
            List<Boolean> isInFOV = SPICEUtil.isInFOV(fov, pointsInFacet);
            nCovered = 0;
            for (boolean b : isInFOV) if (b) nCovered++;
            frac = ((double) nCovered) / pointsInFacet.size();
          }
        }

        StringBuilder output = new StringBuilder(String.format(
            "%s %.4f %5.1f %.1f %d %.1f %.1f %.1f",
            sclkTime,
            dist,
            pos,
            frac * 100,
            flag,
            Math.toDegrees(incidence),
            Math.toDegrees(emission),
            Math.toDegrees(phase)));
        for (int j = 1; j < parts.length; j++) output.append(String.format(" %s", parts[j]));
        output.append("\n");

        String coverage = coverageMap.get(i);
        if (coverage == null) {
          coverageMap.put(i, output.toString());
        } else {
          coverage += output;
          coverageMap.put(i, coverage);
        }
      }
    }
  }

  /**
   * Find the angular distance between pt1 and pt2 as seen from scPos. All coordinates are in the
   * body fixed frame.
   *
   * @param scPos Spacecraft position
   * @param pt1 Point 1
   * @param pt2 Point 2
   * @return distance between pt1 and pt2 in radians.
   */
  private double findDist(Vector3 scPos, Vector3 pt1, Vector3 pt2) {
    Vector3 scToPt1 = pt1.sub(scPos).hat();
    Vector3 scToPt2 = pt2.sub(scPos).hat();

    return Math.acos(scToPt1.dot(scToPt2));
  }

  public void printMap() {
    if (debugLevel > 0) {
      for (int i = 0; i < polydata.GetNumberOfCells(); ++i) {
        outputStream.printf("F%d\n", i + 1);
        String output = coverageMap.get(i);
        if (output != null) outputStream.print(coverageMap.get(i));
      }
    } else {
      List<Integer> list = new ArrayList<>(coverageMap.keySet());
      Collections.sort(list);
      for (Integer i : list) {
        outputStream.printf("F%d\n", i + 1);
        outputStream.print(coverageMap.get(i));
      }
    }
    outputStream.println("END");
  }

  public void process() throws Exception {
    boolean useNEAR = false;
    if (instrument.equalsIgnoreCase("OLA_LOW")) {
      // instID = -64400; // ORX_OLA_BASE
      // instID = -64401; // ORX_OLA_ART
      instID = -64403; // ORX_OLA_LOW
    }
    if (instrument.equalsIgnoreCase("OLA_HIGH")) {
      instID = -64402; // ORX_OLA_HIGH
    } else if (instrument.equalsIgnoreCase("OTES")) {
      instID = -64310; // ORX_OTES
    } else if (instrument.equalsIgnoreCase("OVIRS_SCI")) {
      // instID = -64320; // ORX_OVIRS <- no instrument kernel for this
      instID = -64321; // ORX_OVIRS_SCI
      // instID = -64322; // ORX_OVIRS_SUN
    } else if (instrument.equalsIgnoreCase("REXIS")) {
      instID = -64330; // ORX_REXIS
    } else if (instrument.equalsIgnoreCase("REXIS_SXM")) {
      instID = -64340; // ORX_REXIS_SXM
    } else if (instrument.equalsIgnoreCase("POLYCAM")) {
      instID = -64360; // ORX_OCAMS_POLYCAM
    } else if (instrument.equalsIgnoreCase("MAPCAM")) {
      instID = -64361; // ORX_OCAMS_MAPCAM
    } else if (instrument.equalsIgnoreCase("SAMCAM")) {
      instID = -64362; // ORX_OCAMS_SAMCAM
    } else if (instrument.equalsIgnoreCase("NAVCAM")) {
      // instID = -64070; // ORX_NAVCAM <- no frame kernel for this
      // instID = -64081; // ORX_NAVCAM1 <- no instrument kernel for this
      instID = -64082; // ORX_NAVCAM2 <- no instrument kernel for this
    } else if (instrument.equalsIgnoreCase("NIS_RECT")) {
      useNEAR = true;
      // instID = -93021;
      instID = -93023; // relative to NEAR_NIS_BASE
    } else if (instrument.equalsIgnoreCase("NIS_SQUARE")) {
      useNEAR = true;
      // instID = -93022;
      instID = -93024; // relative to NEAR_NIS_BASE
    }

    NativeLibraryLoader.loadVtkLibraries();
    polydata = PolyDataUtil.loadShapeModelAndComputeNormals(objfile);
    smallBodyModel = new SmallBodyModel(polydata);

    NativeLibraryLoader.loadSpiceLibraries();
    CSPICE.furnsh(spicemetakernel);

    if (useNEAR) {
      SC_ID = -93;
      SC_ID_String = "-93"; // "NEAR";
      TARGET = "2000433";
      BodyFixed = new ReferenceFrame("IAU_EROS");
    } else {
      SC_ID = -64;
      SC_ID_String = "-64"; // "ORX_SPACECRAFT";
      TARGET = "2101955";
      BodyFixed = new ReferenceFrame("IAU_BENNU");
    }

    List<String> sclkLines = FileUtils.readLines(new File(sclkfile), Charset.defaultCharset());
    boolean foundBegin = false;
    for (String line : sclkLines) {
      String trimLine = line.trim();
      if (trimLine.startsWith("#")) continue;
      if (trimLine.startsWith("BEGIN")) {
        foundBegin = true;
        continue;
      }
      if (foundBegin && !trimLine.startsWith("END")) {
        findCoverage(trimLine);
      }
    }
  }

  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
    options.addOption(Option.builder("spice")
        .required()
        .hasArg()
        .desc("SPICE metakernel")
        .build());
    options.addOption(
        Option.builder("obj").required().hasArg().desc("Shape file").build());
    options.addOption(Option.builder("instype")
        .required()
        .hasArg()
        .desc("one of OLA_LOW, OLA_HIGH, OTES, OVIRS_SCI, REXIS, REXIS_SXM, POLYCAM, MAPCAM, SAMCAM, or NAVCAM")
        .build());
    options.addOption(Option.builder("sclk")
        .required()
        .hasArg()
        .desc(
            """
file containing sclk values for instrument observation times. All values between the strings BEGIN and END will be processed.
For example:
BEGIN
3/605862045.24157
END""")
        .build());
    options.addOption(Option.builder("maxdist")
        .required()
        .hasArg()
        .desc("maximum distance of boresight from facet center in milliradians")
        .build());
    options.addOption(Option.builder("all-facets")
        .desc("Optional. If present, entries for all facets will be output, even if there is no intersection.")
        .build());
    options.addOption(Option.builder("verbose")
        .hasArg()
        .desc("Optional. A level of 1 is equivalent to -all-facets. A level of 2 or higher will print out the boresight intersection position at each sclk.")
.build());
return options;
}
public static void main(String[] args) {
TerrasaurTool defaultOBJ = new GetSpots();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
String spicemetakernel = cl.getOptionValue("spice");
String objfile = cl.getOptionValue("obj");
String instrumenttype = cl.getOptionValue("instype");
String sclkfile = cl.getOptionValue("sclk");
double distance = Double.parseDouble(cl.getOptionValue("maxdist"));
int debugLevel = Integer.parseInt(cl.getOptionValue("verbose", "0"));
if (cl.hasOption("all-facets")) debugLevel = debugLevel == 0 ? 1 : debugLevel + 1;
GetSpots gs =
new GetSpots(spicemetakernel, objfile, instrumenttype, sclkfile, distance, debugLevel);
try {
gs.process();
gs.printMap();
} catch (Exception e) {
logger.error(e.getLocalizedMessage(), e);
.build());
options.addOption(Option.builder("maxdist")
.required()
.hasArg()
.desc("maximum distance of boresight from facet center in milliradians")
.build());
options.addOption(Option.builder("all-facets")
.desc("Optional. If present, entries for all facets will be output, even if there is no intersection.")
.build());
options.addOption(Option.builder("verbose")
.hasArg()
.desc(
"Optional. A level of 1 is equivalent to -all-facets. A level of 2 or higher will print out the boresight intersection position at each sclk.")
.build());
return options;
}
public static void main(String[] args) {
TerrasaurTool defaultOBJ = new GetSpots();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
String spicemetakernel = cl.getOptionValue("spice");
String objfile = cl.getOptionValue("obj");
String instrumenttype = cl.getOptionValue("instype");
String sclkfile = cl.getOptionValue("sclk");
double distance = Double.parseDouble(cl.getOptionValue("maxdist"));
int debugLevel = Integer.parseInt(cl.getOptionValue("verbose", "0"));
if (cl.hasOption("all-facets")) debugLevel = debugLevel == 0 ? 1 : debugLevel + 1;
GetSpots gs = new GetSpots(spicemetakernel, objfile, instrumenttype, sclkfile, distance, debugLevel);
try {
gs.process();
gs.printMap();
} catch (Exception e) {
logger.error(e.getLocalizedMessage(), e);
}
}
}
}

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -57,492 +57,470 @@ import vtk.vtkPolyDataWriter;
public class PointCloudFormatConverter implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
private static final Logger logger = LogManager.getLogger();
@Override
public String shortDescription() {
return "Convert an input point cloud to a new format.";
}
@Override
public String shortDescription() {
return "Convert an input point cloud to a new format.";
}
@Override
public String fullDescription(Options options) {
String header = "";
String footer =
"""
@Override
public String fullDescription(Options options) {
String header = "";
String footer =
"""
This program converts an input point cloud to a new format.
Supported input formats are ASCII, BIN3 (x,y,z), BIN4 (x, y, z, w), BIN7 (t, x, y, z, s/c x, y, z), FITS, ICQ, OBJ, PLY, and VTK. Supported output formats are ASCII, BIN3, OBJ, and VTK.
ASCII format is white spaced delimited x y z coordinates. BINARY files must contain double precision x y z coordinates.""";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
private FORMATS inFormat;
private FORMATS outFormat;
private vtkPoints pointsXYZ;
// private List<Double> receivedIntensity;
private vtkPolyData polyData;
private Vector3 center;
private int halfSize;
private double groundSampleDistance;
private double clip;
private String additionalGMTArgs;
private double mapRadius;
public static vtkPoints readPointCloud(String filename) {
PointCloudFormatConverter pcfc = new PointCloudFormatConverter(filename, FORMATS.VTK);
pcfc.read(filename, false);
return pcfc.getPoints();
}
private PointCloudFormatConverter() {}
public PointCloudFormatConverter(FORMATS inFormat, String outFilename) {
this(inFormat, FORMATS.formatFromExtension(outFilename));
}
public PointCloudFormatConverter(String inFilename, FORMATS outFormat) {
this(FORMATS.formatFromExtension(inFilename), outFormat);
}
public PointCloudFormatConverter(String inFilename, String outFilename) {
this(FORMATS.formatFromExtension(inFilename), FORMATS.formatFromExtension(outFilename));
}
public PointCloudFormatConverter(FORMATS inFormat, FORMATS outFormat) {
this.inFormat = inFormat;
this.outFormat = outFormat;
this.pointsXYZ = new vtkPoints();
this.polyData = null;
this.center = null;
this.mapRadius = Math.sqrt(2);
this.halfSize = -1;
this.groundSampleDistance = -1;
this.clip = 1;
this.additionalGMTArgs = "";
}
public PointCloudFormatConverter setPoints(vtkPoints pointsXYZ) {
this.pointsXYZ = pointsXYZ;
return this;
}
public vtkPoints getPoints() {
return pointsXYZ;
}
public void setClip(Double clip) {
this.clip = clip;
}
public void setCenter(double[] centerPt) {
center = new Vector3(centerPt);
}
public void setMapRadius(double mapRadius) {
this.mapRadius = mapRadius;
}
public PointCloudFormatConverter setHalfSize(int halfSize) {
this.halfSize = halfSize;
return this;
}
public PointCloudFormatConverter setGroundSampleDistance(double groundSampleDistance) {
this.groundSampleDistance = groundSampleDistance;
return this;
}
public PointCloudFormatConverter setGMTArgs(String args) {
this.additionalGMTArgs = args;
return this;
}
public void read(String inFile, boolean inLLR) {
switch (inFormat) {
case ASCII:
try (BufferedReader br = new BufferedReader(new FileReader(inFile))) {
String line = br.readLine();
while (line != null) {
line = line.trim();
if (!line.isEmpty() && !line.startsWith("#")) {
String[] parts = line.split("\\s+");
if (inLLR) {
double lon = Math.toRadians(Double.parseDouble(parts[0].trim()));
double lat = Math.toRadians(Double.parseDouble(parts[1].trim()));
double range = Double.parseDouble(parts[2].trim());
double[] xyz = new Vector3D(lon, lat).scalarMultiply(range).toArray();
pointsXYZ.InsertNextPoint(xyz);
} else {
double[] xyz = new double[3];
xyz[0] = Double.parseDouble(parts[0].trim());
xyz[1] = Double.parseDouble(parts[1].trim());
xyz[2] = Double.parseDouble(parts[2].trim());
pointsXYZ.InsertNextPoint(xyz);
}
}
line = br.readLine();
}
} catch (IOException e) {
logger.error(e.getLocalizedMessage(), e);
}
break;
case BIN3:
case BIN4:
case BIN7:
try (DataInputStream dis =
new DataInputStream(new BufferedInputStream(new FileInputStream(inFile)))) {
while (dis.available() > 0) {
if (inFormat == FORMATS.BIN7) {
// skip time field
BinaryUtils.readDoubleAndSwap(dis);
}
if (inLLR) {
double lon = Math.toRadians(BinaryUtils.readDoubleAndSwap(dis));
double lat = Math.toRadians(BinaryUtils.readDoubleAndSwap(dis));
double range = BinaryUtils.readDoubleAndSwap(dis);
double[] xyz = new Vector3D(lon, lat).scalarMultiply(range).toArray();
pointsXYZ.InsertNextPoint(xyz);
} else {
double[] xyz = new double[3];
xyz[0] = BinaryUtils.readDoubleAndSwap(dis);
xyz[1] = BinaryUtils.readDoubleAndSwap(dis);
xyz[2] = BinaryUtils.readDoubleAndSwap(dis);
pointsXYZ.InsertNextPoint(xyz);
}
}
} catch (IOException e) {
logger.error(e.getLocalizedMessage(), e);
}
case ICQ:
case OBJ:
case PLT:
case PLY:
case VTK:
try {
polyData = PolyDataUtil.loadShapeModel(inFile);
pointsXYZ.DeepCopy(polyData.GetPoints());
} catch (Exception e) {
logger.error(e.getLocalizedMessage(), e);
}
break;
case FITS:
try {
polyData = PolyDataUtil.loadFITShapeModel(inFile);
pointsXYZ.DeepCopy(polyData.GetPoints());
} catch (Exception e) {
logger.error(e.getLocalizedMessage(), e);
}
break;
default:
break;
return TerrasaurTool.super.fullDescription(options, header, footer);
}
if (clip != 1) {
BoundingBox bbox = new BoundingBox();
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
double[] point = pointsXYZ.GetPoint(i);
bbox.update(new UnwritableVectorIJK(point[0], point[1], point[2]));
}
BoundingBox clipped = bbox.getScaledBoundingBox(clip);
vtkPoints clippedPoints = new vtkPoints();
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
if (clipped.contains(pointsXYZ.GetPoint(i)))
clippedPoints.InsertNextPoint(pointsXYZ.GetPoint(i));
}
pointsXYZ = clippedPoints;
polyData = null;
private FORMATS inFormat;
private FORMATS outFormat;
private vtkPoints pointsXYZ;
// private List<Double> receivedIntensity;
private vtkPolyData polyData;
private Vector3 center;
private int halfSize;
private double groundSampleDistance;
private double clip;
private String additionalGMTArgs;
private double mapRadius;
public static vtkPoints readPointCloud(String filename) {
PointCloudFormatConverter pcfc = new PointCloudFormatConverter(filename, FORMATS.VTK);
pcfc.read(filename, false);
return pcfc.getPoints();
}
}
public void write(String outFile, boolean outLLR) {
switch (outFormat) {
case ASCII:
try (PrintWriter out = new PrintWriter(new BufferedWriter(new FileWriter(outFile)))) {
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
double[] thisPoint = pointsXYZ.GetPoint(i);
if (outLLR) {
Vector3D v = new Vector3D(thisPoint);
out.printf(
"%f %f %f\n",
Math.toDegrees(v.getAlpha()), Math.toDegrees(v.getDelta()), v.getNorm());
} else {
out.printf("%f %f %f\n", thisPoint[0], thisPoint[1], thisPoint[2]);
}
}
} catch (IOException e) {
logger.error(e.getLocalizedMessage(), e);
}
break;
case BIN3:
try (DataOutputStream os =
new DataOutputStream(new BufferedOutputStream(new FileOutputStream(outFile)))) {
private PointCloudFormatConverter() {}
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
double[] thisPoint = pointsXYZ.GetPoint(i);
if (outLLR) {
Vector3D v = new Vector3D(thisPoint);
public PointCloudFormatConverter(FORMATS inFormat, String outFilename) {
this(inFormat, FORMATS.formatFromExtension(outFilename));
}
BinaryUtils.writeDoubleAndSwap(os, Math.toDegrees(v.getAlpha()));
BinaryUtils.writeDoubleAndSwap(os, Math.toDegrees(v.getDelta()));
BinaryUtils.writeDoubleAndSwap(os, v.getNorm());
} else {
for (int ii = 0; ii < 3; ii++) BinaryUtils.writeDoubleAndSwap(os, thisPoint[ii]);
}
}
} catch (IOException e) {
logger.error(e.getLocalizedMessage(), e);
public PointCloudFormatConverter(String inFilename, FORMATS outFormat) {
this(FORMATS.formatFromExtension(inFilename), outFormat);
}
public PointCloudFormatConverter(String inFilename, String outFilename) {
this(FORMATS.formatFromExtension(inFilename), FORMATS.formatFromExtension(outFilename));
}
public PointCloudFormatConverter(FORMATS inFormat, FORMATS outFormat) {
this.inFormat = inFormat;
this.outFormat = outFormat;
this.pointsXYZ = new vtkPoints();
this.polyData = null;
this.center = null;
this.mapRadius = Math.sqrt(2);
this.halfSize = -1;
this.groundSampleDistance = -1;
this.clip = 1;
this.additionalGMTArgs = "";
}
public PointCloudFormatConverter setPoints(vtkPoints pointsXYZ) {
this.pointsXYZ = pointsXYZ;
return this;
}
public vtkPoints getPoints() {
return pointsXYZ;
}
public void setClip(Double clip) {
this.clip = clip;
}
public void setCenter(double[] centerPt) {
center = new Vector3(centerPt);
}
public void setMapRadius(double mapRadius) {
this.mapRadius = mapRadius;
}
public PointCloudFormatConverter setHalfSize(int halfSize) {
this.halfSize = halfSize;
return this;
}
public PointCloudFormatConverter setGroundSampleDistance(double groundSampleDistance) {
this.groundSampleDistance = groundSampleDistance;
return this;
}
public PointCloudFormatConverter setGMTArgs(String args) {
this.additionalGMTArgs = args;
return this;
}
public void read(String inFile, boolean inLLR) {
switch (inFormat) {
case ASCII:
try (BufferedReader br = new BufferedReader(new FileReader(inFile))) {
String line = br.readLine();
while (line != null) {
line = line.trim();
if (!line.isEmpty() && !line.startsWith("#")) {
String[] parts = line.split("\\s+");
if (inLLR) {
double lon = Math.toRadians(Double.parseDouble(parts[0].trim()));
double lat = Math.toRadians(Double.parseDouble(parts[1].trim()));
double range = Double.parseDouble(parts[2].trim());
double[] xyz = new Vector3D(lon, lat)
.scalarMultiply(range)
.toArray();
pointsXYZ.InsertNextPoint(xyz);
} else {
double[] xyz = new double[3];
xyz[0] = Double.parseDouble(parts[0].trim());
xyz[1] = Double.parseDouble(parts[1].trim());
xyz[2] = Double.parseDouble(parts[2].trim());
pointsXYZ.InsertNextPoint(xyz);
}
}
line = br.readLine();
}
} catch (IOException e) {
logger.error(e.getLocalizedMessage(), e);
}
break;
case BIN3:
case BIN4:
case BIN7:
try (DataInputStream dis = new DataInputStream(new BufferedInputStream(new FileInputStream(inFile)))) {
while (dis.available() > 0) {
if (inFormat == FORMATS.BIN7) {
// skip time field
BinaryUtils.readDoubleAndSwap(dis);
}
if (inLLR) {
double lon = Math.toRadians(BinaryUtils.readDoubleAndSwap(dis));
double lat = Math.toRadians(BinaryUtils.readDoubleAndSwap(dis));
double range = BinaryUtils.readDoubleAndSwap(dis);
double[] xyz =
new Vector3D(lon, lat).scalarMultiply(range).toArray();
pointsXYZ.InsertNextPoint(xyz);
} else {
double[] xyz = new double[3];
xyz[0] = BinaryUtils.readDoubleAndSwap(dis);
xyz[1] = BinaryUtils.readDoubleAndSwap(dis);
xyz[2] = BinaryUtils.readDoubleAndSwap(dis);
pointsXYZ.InsertNextPoint(xyz);
}
}
} catch (IOException e) {
logger.error(e.getLocalizedMessage(), e);
}
case ICQ:
case OBJ:
case PLT:
case PLY:
case VTK:
try {
polyData = PolyDataUtil.loadShapeModel(inFile);
pointsXYZ.DeepCopy(polyData.GetPoints());
} catch (Exception e) {
logger.error(e.getLocalizedMessage(), e);
}
break;
case FITS:
try {
polyData = PolyDataUtil.loadFITShapeModel(inFile);
pointsXYZ.DeepCopy(polyData.GetPoints());
} catch (Exception e) {
logger.error(e.getLocalizedMessage(), e);
}
break;
default:
break;
}
break;
case OBJ:
if (polyData != null) {
try {
PolyDataUtil.saveShapeModelAsOBJ(polyData, outFile);
} catch (IOException e) {
logger.error(e.getLocalizedMessage(), e);
}
} else {
if (halfSize < 0 || groundSampleDistance < 0) {
System.out.printf(
"Must supply -halfSize and -groundSampleDistance for %s output\n",
outFormat);
return;
}
final double radius = mapRadius * halfSize * groundSampleDistance;
vtkPoints vtkPoints = pointsXYZ;
if (center != null) {
vtkPoints = new vtkPoints();
if (clip != 1) {
BoundingBox bbox = new BoundingBox();
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
Vector3 pt = new Vector3(pointsXYZ.GetPoint(i));
if (center.sub(new Vector3(pt)).norm() > radius) continue;
vtkPoints.InsertNextPoint(pt.toArray());
double[] point = pointsXYZ.GetPoint(i);
bbox.update(new UnwritableVectorIJK(point[0], point[1], point[2]));
}
}
PointCloudToPlane pctp = new PointCloudToPlane(vtkPoints, halfSize, groundSampleDistance);
pctp.getGMU().setFieldToHeight();
pctp.getGMU().setGMTArgs(additionalGMTArgs);
try {
double[][][] regridField = pctp.getGMU().regridField();
vtkPolyData griddedXYZ = PolyDataUtil.loadLocalFitsLLRModelN(regridField);
PolyDataUtil.saveShapeModelAsOBJ(griddedXYZ, outFile);
} catch (Exception e) {
logger.error(e.getLocalizedMessage(), e);
}
BoundingBox clipped = bbox.getScaledBoundingBox(clip);
vtkPoints clippedPoints = new vtkPoints();
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
if (clipped.contains(pointsXYZ.GetPoint(i))) clippedPoints.InsertNextPoint(pointsXYZ.GetPoint(i));
}
pointsXYZ = clippedPoints;
polyData = null;
}
break;
case VTK:
if (polyData == null) {
polyData = new vtkPolyData();
polyData.SetPoints(pointsXYZ);
}
public void write(String outFile, boolean outLLR) {
switch (outFormat) {
case ASCII:
try (PrintWriter out = new PrintWriter(new BufferedWriter(new FileWriter(outFile)))) {
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
double[] thisPoint = pointsXYZ.GetPoint(i);
if (outLLR) {
Vector3D v = new Vector3D(thisPoint);
out.printf(
"%f %f %f\n",
Math.toDegrees(v.getAlpha()), Math.toDegrees(v.getDelta()), v.getNorm());
} else {
out.printf("%f %f %f\n", thisPoint[0], thisPoint[1], thisPoint[2]);
}
}
} catch (IOException e) {
logger.error(e.getLocalizedMessage(), e);
}
break;
case BIN3:
try (DataOutputStream os =
new DataOutputStream(new BufferedOutputStream(new FileOutputStream(outFile)))) {
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
double[] thisPoint = pointsXYZ.GetPoint(i);
if (outLLR) {
Vector3D v = new Vector3D(thisPoint);
BinaryUtils.writeDoubleAndSwap(os, Math.toDegrees(v.getAlpha()));
BinaryUtils.writeDoubleAndSwap(os, Math.toDegrees(v.getDelta()));
BinaryUtils.writeDoubleAndSwap(os, v.getNorm());
} else {
for (int ii = 0; ii < 3; ii++) BinaryUtils.writeDoubleAndSwap(os, thisPoint[ii]);
}
}
} catch (IOException e) {
logger.error(e.getLocalizedMessage(), e);
}
break;
case OBJ:
if (polyData != null) {
try {
PolyDataUtil.saveShapeModelAsOBJ(polyData, outFile);
} catch (IOException e) {
logger.error(e.getLocalizedMessage(), e);
}
} else {
if (halfSize < 0 || groundSampleDistance < 0) {
logger.error("Must supply -halfSize and -groundSampleDistance for {} output", outFormat);
return;
}
final double radius = mapRadius * halfSize * groundSampleDistance;
vtkPoints vtkPoints = pointsXYZ;
if (center != null) {
vtkPoints = new vtkPoints();
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
Vector3 pt = new Vector3(pointsXYZ.GetPoint(i));
if (center.sub(new Vector3(pt)).norm() > radius) continue;
vtkPoints.InsertNextPoint(pt.toArray());
}
}
PointCloudToPlane pctp = new PointCloudToPlane(vtkPoints, halfSize, groundSampleDistance);
pctp.getGMU().setFieldToHeight();
pctp.getGMU().setGMTArgs(additionalGMTArgs);
try {
double[][][] regridField = pctp.getGMU().regridField();
vtkPolyData griddedXYZ = PolyDataUtil.loadLocalFitsLLRModelN(regridField);
PolyDataUtil.saveShapeModelAsOBJ(griddedXYZ, outFile);
} catch (Exception e) {
logger.error(e.getLocalizedMessage(), e);
}
}
break;
case VTK:
if (polyData == null) {
polyData = new vtkPolyData();
polyData.SetPoints(pointsXYZ);
}
vtkCellArray cells = new vtkCellArray();
vtkFloatArray albedo = new vtkFloatArray();
albedo.SetName("albedo");
polyData.SetPolys(cells);
polyData.GetPointData().AddArray(albedo);
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
vtkIdList idList = new vtkIdList();
idList.InsertNextId(i);
cells.InsertNextCell(idList);
albedo.InsertNextValue(0.5f);
}
vtkPolyDataWriter writer = new vtkPolyDataWriter();
writer.SetInputData(polyData);
writer.SetFileName(outFile);
writer.SetFileTypeToBinary();
writer.Update();
break;
default:
break;
}
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(Option.builder("logLevel")
.hasArg()
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim()
+ ". Default is INFO.")
.build());
options.addOption(Option.builder("inputFormat")
.hasArg()
.desc("Format of input file. If not present format will be inferred from inputFile extension.")
.build());
options.addOption(Option.builder("inputFile")
.required()
.hasArg()
.desc("Required. Name of input file.")
.build());
options.addOption(Option.builder("outputFormat")
.hasArg()
.desc("Format of output file. If not present format will be inferred from outputFile extension.")
.build());
options.addOption(Option.builder("outputFile")
.required()
.hasArg()
.desc("Required. Name of output file.")
.build());
options.addOption(Option.builder("inllr")
.desc(
"Only used with ASCII or BINARY formats. If present, input values are assumed to be lon, lat, rad. Default is x, y, z.")
.build());
options.addOption(Option.builder("outllr")
.desc(
"Only used with ASCII or BINARY formats. If present, output values will be lon, lat, rad. Default is x, y, z.")
.build());
options.addOption(Option.builder("centerXYZ")
.hasArg()
.desc(
"Only used to generate OBJ output. Center output shape on supplied coordinates. Specify XYZ coordinates as three floating point numbers separated"
+ " by commas.")
.build());
options.addOption(Option.builder("centerLonLat")
.hasArg()
.desc(
"Only used to generate OBJ output. Center output shape on supplied lon,lat. Specify lon,lat in degrees as floating point numbers separated"
+ " by a comma. Shape will be centered on the point closest to this lon,lat pair.")
.build());
options.addOption(Option.builder("halfSize")
.hasArg()
.desc(
"Only used to generate OBJ output. Used with -groundSampleDistance to resample to a uniform grid. Grid dimensions are (2*halfSize+1)x(2*halfSize+1).")
.build());
options.addOption(Option.builder("groundSampleDistance")
.hasArg()
.desc(
"Used with -halfSize to resample to a uniform grid. Spacing between grid points. Only used to generate OBJ output. "
+ "Units are the same as the input file, usually km.")
.build());
options.addOption(Option.builder("mapRadius")
.hasArg()
.desc(
"Only used to generate OBJ output. Used with -centerXYZ to resample to a uniform grid. Only include points within "
+ "mapRadius*groundSampleDistance*halfSize of centerXYZ. Default value is sqrt(2).")
.build());
options.addOption(Option.builder("gmtArgs")
.hasArg()
.longOpt("gmt-args")
.desc(
"Only used to generate OBJ output. Pass additional options to GMTSurface. May be used multiple times, use once per additional argument.")
.build());
options.addOption(Option.builder("clip")
.hasArg()
.desc(
"Shrink bounding box to a relative size of <arg> and clip any points outside of it. Default is 1 (no clipping).")
.build());
return options;
}
public static void main(String[] args) {
TerrasaurTool defaultOBJ = new PointCloudFormatConverter();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
NativeLibraryLoader.loadVtkLibraries();
NativeLibraryLoader.loadSpiceLibraries();
String inFile = cl.getOptionValue("inputFile");
String outFile = cl.getOptionValue("outputFile");
boolean inLLR = cl.hasOption("inllr");
boolean outLLR = cl.hasOption("outllr");
FORMATS inFormat = cl.hasOption("inputFormat")
? FORMATS.valueOf(cl.getOptionValue("inputFormat").toUpperCase())
: FORMATS.formatFromExtension(cl.getOptionValue("inputFile"));
FORMATS outFormat = cl.hasOption("outputFormat")
? FORMATS.valueOf(cl.getOptionValue("outputFormat").toUpperCase())
: FORMATS.formatFromExtension(cl.getOptionValue("outputFile"));
PointCloudFormatConverter pcfc = new PointCloudFormatConverter(inFormat, outFormat);
if (cl.hasOption("centerXYZ")) {
String[] params = cl.getOptionValue("centerXYZ").split(",");
double[] array = new double[3];
for (int i = 0; i < 3; i++) array[i] = Double.parseDouble(params[i].trim());
pcfc.setCenter(array);
}
vtkCellArray cells = new vtkCellArray();
vtkFloatArray albedo = new vtkFloatArray();
albedo.SetName("albedo");
polyData.SetPolys(cells);
polyData.GetPointData().AddArray(albedo);
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
vtkIdList idList = new vtkIdList();
idList.InsertNextId(i);
cells.InsertNextCell(idList);
albedo.InsertNextValue(0.5f);
if (cl.hasOption("clip")) {
pcfc.setClip(Double.valueOf(cl.getOptionValue("clip")));
}
vtkPolyDataWriter writer = new vtkPolyDataWriter();
writer.SetInputData(polyData);
writer.SetFileName(outFile);
writer.SetFileTypeToBinary();
writer.Update();
break;
default:
break;
}
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(
Option.builder("logLevel")
.hasArg()
.desc(
"If present, print messages above selected priority. Valid values are "
+ sb.toString().trim()
+ ". Default is INFO.")
.build());
options.addOption(
Option.builder("inputFormat")
.hasArg()
.desc(
"Format of input file. If not present format will be inferred from inputFile extension.")
.build());
options.addOption(
Option.builder("inputFile")
.required()
.hasArg()
.desc("Required. Name of input file.")
.build());
options.addOption(
Option.builder("outputFormat")
.hasArg()
.desc(
"Format of output file. If not present format will be inferred from outputFile extension.")
.build());
options.addOption(
Option.builder("outputFile")
.required()
.hasArg()
.desc("Required. Name of output file.")
.build());
options.addOption(
Option.builder("inllr")
.desc(
"Only used with ASCII or BINARY formats. If present, input values are assumed to be lon, lat, rad. Default is x, y, z.")
.build());
options.addOption(
Option.builder("outllr")
.desc(
"Only used with ASCII or BINARY formats. If present, output values will be lon, lat, rad. Default is x, y, z.")
.build());
options.addOption(
Option.builder("centerXYZ")
.hasArg()
.desc(
"Only used to generate OBJ output. Center output shape on supplied coordinates. Specify XYZ coordinates as three floating point numbers separated"
+ " by commas.")
.build());
options.addOption(
Option.builder("centerLonLat")
.hasArg()
.desc(
"Only used to generate OBJ output. Center output shape on supplied lon,lat. Specify lon,lat in degrees as floating point numbers separated"
+ " by a comma. Shape will be centered on the point closest to this lon,lat pair.")
.build());
options.addOption(
Option.builder("halfSize")
.hasArg()
.desc(
"Only used to generate OBJ output. Used with -groundSampleDistance to resample to a uniform grid. Grid dimensions are (2*halfSize+1)x(2*halfSize+1).")
.build());
options.addOption(
Option.builder("groundSampleDistance")
.hasArg()
.desc(
"Used with -halfSize to resample to a uniform grid. Spacing between grid points. Only used to generate OBJ output. "
+ "Units are the same as the input file, usually km.")
.build());
options.addOption(
Option.builder("mapRadius")
.hasArg()
.desc(
"Only used to generate OBJ output. Used with -centerXYZ to resample to a uniform grid. Only include points within "
+ "mapRadius*groundSampleDistance*halfSize of centerXYZ. Default value is sqrt(2).")
.build());
options.addOption(
Option.builder("gmtArgs")
.hasArg()
.longOpt("gmt-args")
.desc(
"Only used to generate OBJ output. Pass additional options to GMTSurface. May be used multiple times, use once per additional argument.")
.build());
options.addOption(
Option.builder("clip")
.hasArg()
.desc(
"Shrink bounding box to a relative size of <arg> and clip any points outside of it. Default is 1 (no clipping).")
.build());
return options;
}
public static void main(String[] args) {
TerrasaurTool defaultOBJ = new PointCloudFormatConverter();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
NativeLibraryLoader.loadVtkLibraries();
NativeLibraryLoader.loadSpiceLibraries();
String inFile = cl.getOptionValue("inputFile");
String outFile = cl.getOptionValue("outputFile");
boolean inLLR = cl.hasOption("inllr");
boolean outLLR = cl.hasOption("outllr");
FORMATS inFormat =
cl.hasOption("inputFormat")
? FORMATS.valueOf(cl.getOptionValue("inputFormat").toUpperCase())
: FORMATS.formatFromExtension(cl.getOptionValue("inputFile"));
FORMATS outFormat =
cl.hasOption("outputFormat")
? FORMATS.valueOf(cl.getOptionValue("outputFormat").toUpperCase())
: FORMATS.formatFromExtension(cl.getOptionValue("outputFile"));
PointCloudFormatConverter pcfc = new PointCloudFormatConverter(inFormat, outFormat);
if (cl.hasOption("centerXYZ")) {
String[] params = cl.getOptionValue("centerXYZ").split(",");
double[] array = new double[3];
for (int i = 0; i < 3; i++) array[i] = Double.parseDouble(params[i].trim());
pcfc.setCenter(array);
}
if (cl.hasOption("clip")) {
pcfc.setClip(Double.valueOf(cl.getOptionValue("clip")));
}
if (cl.hasOption("gmtArgs")) {
StringBuilder gmtArgs = new StringBuilder();
for (String arg : cl.getOptionValues("gmtArgs")) gmtArgs.append(String.format("%s ", arg));
pcfc.setGMTArgs(gmtArgs.toString());
}
pcfc.read(inFile, inLLR);
if (cl.hasOption("centerLonLat")) {
String[] params = cl.getOptionValue("centerLonLat").split(",");
Vector3D lcDir =
new Vector3D(
Math.toRadians(Double.parseDouble(params[0].trim())),
Math.toRadians(Double.parseDouble(params[1].trim())));
double[] center = null;
double minSep = Double.MAX_VALUE;
vtkPoints vtkPoints = pcfc.getPoints();
for (int i = 0; i < vtkPoints.GetNumberOfPoints(); i++) {
double[] pt = vtkPoints.GetPoint(i);
double sep = Vector3D.angle(lcDir, new Vector3D(pt));
if (sep < minSep) {
minSep = sep;
center = pt;
if (cl.hasOption("gmtArgs")) {
StringBuilder gmtArgs = new StringBuilder();
for (String arg : cl.getOptionValues("gmtArgs")) gmtArgs.append(String.format("%s ", arg));
pcfc.setGMTArgs(gmtArgs.toString());
}
}
pcfc.setCenter(center);
pcfc.read(inFile, inLLR);
if (cl.hasOption("centerLonLat")) {
String[] params = cl.getOptionValue("centerLonLat").split(",");
Vector3D lcDir = new Vector3D(
Math.toRadians(Double.parseDouble(params[0].trim())),
Math.toRadians(Double.parseDouble(params[1].trim())));
double[] center = null;
double minSep = Double.MAX_VALUE;
vtkPoints vtkPoints = pcfc.getPoints();
for (int i = 0; i < vtkPoints.GetNumberOfPoints(); i++) {
double[] pt = vtkPoints.GetPoint(i);
double sep = Vector3D.angle(lcDir, new Vector3D(pt));
if (sep < minSep) {
minSep = sep;
center = pt;
}
}
pcfc.setCenter(center);
}
pcfc.setMapRadius(
cl.hasOption("mapRadius") ? Double.parseDouble(cl.getOptionValue("mapRadius")) : Math.sqrt(2));
if (cl.hasOption("halfSize") && cl.hasOption("groundSampleDistance")) {
// resample on a uniform XY grid
pcfc.setHalfSize(Integer.parseInt(cl.getOptionValue("halfSize")));
pcfc.setGroundSampleDistance(Double.parseDouble(cl.getOptionValue("groundSampleDistance")));
}
pcfc.write(outFile, outLLR);
}
}


@@ -0,0 +1,399 @@
/*
* The MIT License
* Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in
* all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
* THE SOFTWARE.
*/
package terrasaur.apps;
import java.awt.geom.Path2D;
import java.awt.geom.Point2D;
import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import java.util.Map;
import nom.tam.fits.FitsException;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.commons.math3.geometry.euclidean.twod.Vector2D;
import org.apache.commons.math3.geometry.euclidean.twod.hull.MonotoneChain;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import picante.math.coords.LatitudinalVector;
import spice.basic.SpiceException;
import spice.basic.Vector3;
import terrasaur.enums.FORMATS;
import terrasaur.templates.TerrasaurTool;
import terrasaur.utils.NativeLibraryLoader;
import terrasaur.utils.VectorStatistics;
import terrasaur.utils.tessellation.StereographicProjection;
import vtk.vtkPoints;
public class PointCloudOverlap implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
@Override
public String shortDescription() {
return "Find points in a point cloud which overlap a reference point cloud.";
}
@Override
public String fullDescription(Options options) {
String header = "";
String footer =
"\nThis program finds all points in the input point cloud which overlap the points in a reference point cloud.\n\n"
+ "Supported input formats are ASCII, BINARY, L2, OBJ, and VTK. Supported output formats are ASCII, BINARY, L2, and VTK. "
+ "ASCII format is whitespace-delimited x y z coordinates. BINARY files must contain double-precision x y z coordinates.\n\n"
+ "A plane is fit to the reference point cloud and all points in each cloud are projected onto this plane. Any point in the "
+ "projected input cloud which falls within the outline of the projected reference cloud is considered to be overlapping.";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
private Path2D.Double polygon;
private StereographicProjection proj;
/*- Useful for debugging
private vtkPoints ref2DPoints;
private vtkPoints input2DPoints;
private vtkPolyData polygonPolyData;
private vtkCellArray polygonCells;
private vtkPoints polygonPoints;
private vtkDoubleArray polygonSuccessArray;
*/
public PointCloudOverlap(Collection<double[]> refPoints) {
if (refPoints != null) {
VectorStatistics vStats = new VectorStatistics();
for (double[] pt : refPoints) vStats.add(new Vector3(pt));
Vector3D centerXYZ = vStats.getMean();
proj = new StereographicProjection(new LatitudinalVector(1, centerXYZ.getDelta(), centerXYZ.getAlpha()));
createRefPolygon(refPoints);
}
}
public StereographicProjection getProjection() {
return proj;
}
private void createRefPolygon(Collection<double[]> refPoints) {
List<Vector2D> stereographicPoints = new ArrayList<>();
for (double[] refPt : refPoints) {
Vector3D point3D = new Vector3D(refPt);
Point2D point = proj.forward(point3D.getDelta(), point3D.getAlpha());
stereographicPoints.add(new Vector2D(point.getX(), point.getY()));
}
/*-
ref2DPoints = new vtkPoints();
input2DPoints = new vtkPoints();
polygonPolyData = new vtkPolyData();
polygonCells = new vtkCellArray();
polygonPoints = new vtkPoints();
polygonSuccessArray = new vtkDoubleArray();
polygonSuccessArray.SetName("success");
polygonPolyData.SetPoints(polygonPoints);
polygonPolyData.SetLines(polygonCells);
polygonPolyData.GetCellData().AddArray(polygonSuccessArray);
for (Vector2D refPoint : refPoints)
ref2DPoints.InsertNextPoint(refPoint.getX(), refPoint.getY(), 0);
*/
MonotoneChain mc = new MonotoneChain();
List<Vector2D> vertices = new ArrayList<>(mc.findHullVertices(stereographicPoints));
/*-
for (int i = 1; i < vertices.size(); i++) {
Vector2D lastPt = vertices.get(i - 1);
Vector2D thisPt = vertices.get(i);
System.out.printf("%f %f to %f %f distance %f\n", lastPt.getX(), lastPt.getY(), thisPt.getX(),
thisPt.getY(), lastPt.distance(thisPt));
}
Vector2D lastPt = vertices.get(vertices.size() - 1);
Vector2D thisPt = vertices.get(0);
System.out.printf("%f %f to %f %f distance %f\n", lastPt.getX(), lastPt.getY(), thisPt.getX(),
thisPt.getY(), lastPt.distance(thisPt));
*/
// int id0 = 0;
for (Vector2D vertex : vertices) {
// int id1 = polygonPoints.InsertNextPoint(vertex.getX(), vertex.getY(), 0);
if (polygon == null) {
polygon = new Path2D.Double();
polygon.moveTo(vertex.getX(), vertex.getY());
} else {
polygon.lineTo(vertex.getX(), vertex.getY());
/*-
vtkLine line = new vtkLine();
line.GetPointIds().SetId(0, id0);
line.GetPointIds().SetId(1, id1);
polygonCells.InsertNextCell(line);
*/
}
// id0 = id1;
}
polygon.closePath();
/*-
vtkPolyDataWriter writer = new vtkPolyDataWriter();
writer.SetInputData(polygonPolyData);
writer.SetFileName("polygon2D.vtk");
writer.SetFileTypeToBinary();
writer.Update();
writer = new vtkPolyDataWriter();
polygonPolyData = new vtkPolyData();
polygonPolyData.SetPoints(ref2DPoints);
writer.SetInputData(polygonPolyData);
writer.SetFileName("refPoints.vtk");
writer.SetFileTypeToBinary();
writer.Update();
*/
}
public boolean isEnclosed(double[] xyz) {
Vector3D point = new Vector3D(xyz);
Point2D projected = proj.forward(point.getDelta(), point.getAlpha());
return polygon.contains(projected.getX(), projected.getY());
}
/**
* @param inputPoints points to consider
* @param scale scale factor
* @return indices of all points inside the scaled polygon
*/
public List<Integer> scalePoints(List<double[]> inputPoints, double scale) {
List<Vector2D> projected = new ArrayList<>();
for (double[] inputPoint : inputPoints) {
Vector3D point = new Vector3D(inputPoint);
Point2D projectedPoint = proj.forward(point.getDelta(), point.getAlpha());
projected.add(new Vector2D(projectedPoint.getX(), projectedPoint.getY()));
}
Vector2D center = new Vector2D(0, 0);
for (Vector2D inputPoint : projected) center = center.add(inputPoint);
center = center.scalarMultiply(1. / inputPoints.size());
List<Vector2D> translatedPoints = new ArrayList<>();
for (Vector2D inputPoint : projected) translatedPoints.add(inputPoint.subtract(center));
Path2D.Double thisPolygon = null;
MonotoneChain mc = new MonotoneChain();
Collection<Vector2D> vertices = mc.findHullVertices(translatedPoints);
for (Vector2D vertex : vertices) {
if (thisPolygon == null) {
thisPolygon = new Path2D.Double();
thisPolygon.moveTo(scale * vertex.getX(), scale * vertex.getY());
} else {
thisPolygon.lineTo(scale * vertex.getX(), scale * vertex.getY());
}
}
thisPolygon.closePath();
List<Integer> indices = new ArrayList<>();
for (int i = 0; i < projected.size(); i++) {
Vector2D inputPoint = projected.get(i);
if (thisPolygon.contains(inputPoint.getX() - center.getX(), inputPoint.getY() - center.getY()))
indices.add(i);
}
return indices;
}
private static Options defineOptions() {
Options options = new Options();
options.addOption(Option.builder("inputFormat")
.hasArg()
.desc("Format of input file. If not present format will be inferred from file extension.")
.build());
options.addOption(Option.builder("inputFile")
.required()
.hasArg()
.desc("Required. Name of input file.")
.build());
options.addOption(Option.builder("inllr")
.desc(
"If present, input values are assumed to be lon, lat, rad. Default is x, y, z. Only used with ASCII or BINARY formats.")
.build());
options.addOption(Option.builder("referenceFormat")
.hasArg()
.desc("Format of reference file. If not present format will be inferred from file extension.")
.build());
options.addOption(Option.builder("referenceFile")
.required()
.hasArg()
.desc("Required. Name of reference file.")
.build());
options.addOption(Option.builder("refllr")
.desc(
"If present, reference values are assumed to be lon, lat, rad. Default is x, y, z. Only used with ASCII or BINARY formats.")
.build());
options.addOption(Option.builder("outputFormat")
.hasArg()
.desc("Format of output file. If not present format will be inferred from file extension.")
.build());
options.addOption(Option.builder("outputFile")
.required()
.hasArg()
.desc("Required. Name of output file.")
.build());
options.addOption(Option.builder("outllr")
.desc(
"If present, output values will be lon, lat, rad. Default is x, y, z. Only used with ASCII or BINARY formats.")
.build());
options.addOption(Option.builder("scale")
.hasArg()
.desc("Value to scale bounding box containing intersect region. Default is 1.0.")
.build());
return options;
}
public static void main(String[] args) throws SpiceException, IOException, InterruptedException, FitsException {
NativeLibraryLoader.loadVtkLibraries();
NativeLibraryLoader.loadSpiceLibraries();
TerrasaurTool defaultOBJ = new PointCloudOverlap(null);
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
// Read the reference file
FORMATS refFormat = cl.hasOption("referenceFormat")
? FORMATS.valueOf(cl.getOptionValue("referenceFormat").toUpperCase())
: FORMATS.formatFromExtension(cl.getOptionValue("referenceFile"));
String refFile = cl.getOptionValue("referenceFile");
boolean refLLR = cl.hasOption("refllr");
PointCloudFormatConverter pcfc = new PointCloudFormatConverter(refFormat, FORMATS.VTK);
pcfc.read(refFile, refLLR);
vtkPoints referencePoints = pcfc.getPoints();
logger.info("{} points read from {}", referencePoints.GetNumberOfPoints(), refFile);
List<double[]> refPts = new ArrayList<>();
for (int i = 0; i < referencePoints.GetNumberOfPoints(); i++) {
refPts.add(referencePoints.GetPoint(i));
}
// create the overlap object and set the enclosing polygon
PointCloudOverlap pco = new PointCloudOverlap(refPts);
// Read the input point cloud
FORMATS inFormat = cl.hasOption("inputFormat")
? FORMATS.valueOf(cl.getOptionValue("inputFormat").toUpperCase())
: FORMATS.formatFromExtension(cl.getOptionValue("inputFile"));
String inFile = cl.getOptionValue("inputFile");
boolean inLLR = cl.hasOption("inllr");
pcfc = new PointCloudFormatConverter(inFormat, FORMATS.VTK);
pcfc.read(inFile, inLLR);
vtkPoints inputPoints = pcfc.getPoints();
logger.info("{} points read from {}", inputPoints.GetNumberOfPoints(), inFile);
List<Integer> enclosedIndices = new ArrayList<>();
for (int i = 0; i < inputPoints.GetNumberOfPoints(); i++) {
double[] pt = inputPoints.GetPoint(i);
if (pco.isEnclosed(pt)) enclosedIndices.add(i);
}
if (cl.hasOption("scale")) {
List<double[]> pts = new ArrayList<>();
for (Integer i : enclosedIndices) pts.add(inputPoints.GetPoint(i));
// this list includes which of the enclosed points are inside the scaled polygon
List<Integer> theseIndices = pco.scalePoints(pts, Double.parseDouble(cl.getOptionValue("scale")));
// now relate this list back to the original list of points
List<Integer> newIndices = new ArrayList<>();
for (Integer i : theseIndices) newIndices.add(enclosedIndices.get(i));
enclosedIndices = newIndices;
}
VectorStatistics xyzStats = new VectorStatistics();
VectorStatistics xyStats = new VectorStatistics();
for (Integer i : enclosedIndices) {
double[] thisPt = inputPoints.GetPoint(i);
Vector3D thisPt3D = new Vector3D(thisPt);
xyzStats.add(thisPt3D);
Point2D projectedPt = pco.getProjection().forward(thisPt3D.getDelta(), thisPt3D.getAlpha());
xyStats.add(new Vector3(projectedPt.getX(), projectedPt.getY(), 0));
}
logger.info(
"Center XYZ: {}, {}, {}",
xyzStats.getMean().getX(),
xyzStats.getMean().getY(),
xyzStats.getMean().getZ());
Vector3D centerXYZ = xyzStats.getMean();
logger.info(
"Center lon, lat: {}, {}\n",
Math.toDegrees(centerXYZ.getAlpha()),
Math.toDegrees(centerXYZ.getDelta()));
logger.info(
"xmin/xmax/extent: {}/{}/{}\n",
xyzStats.getMin().getX(),
xyzStats.getMax().getX(),
xyzStats.getMax().getX() - xyzStats.getMin().getX());
logger.info(
"ymin/ymax/extent: {}/{}/{}\n",
xyzStats.getMin().getY(),
xyzStats.getMax().getY(),
xyzStats.getMax().getY() - xyzStats.getMin().getY());
logger.info(
"zmin/zmax/extent: {}/{}/{}\n",
xyzStats.getMin().getZ(),
xyzStats.getMax().getZ(),
xyzStats.getMax().getZ() - xyzStats.getMin().getZ());
FORMATS outFormat = cl.hasOption("outputFormat")
? FORMATS.valueOf(cl.getOptionValue("outputFormat").toUpperCase())
: FORMATS.formatFromExtension(cl.getOptionValue("outputFile"));
vtkPoints pointsToWrite = new vtkPoints();
for (Integer i : enclosedIndices) pointsToWrite.InsertNextPoint(inputPoints.GetPoint(i));
pcfc = new PointCloudFormatConverter(FORMATS.VTK, outFormat);
pcfc.setPoints(pointsToWrite);
String outputFilename = cl.getOptionValue("outputFile");
pcfc.write(outputFilename, cl.hasOption("outllr"));
if (new File(outputFilename).exists()) {
logger.info("{} points written to {}", pointsToWrite.GetNumberOfPoints(), outputFilename);
} else {
logger.error("Could not write {}", outputFilename);
}
logger.info("Finished");
}
}
// TODO write out center of output pointcloud
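The overlap test above reduces to three steps: project both clouds to 2D, take the convex hull of the reference cloud, and test each candidate point for containment. Below is a minimal sketch of that idea using only the JDK: a hand-rolled Andrew's monotone chain stands in for Commons Math's `MonotoneChain`, the hull is traced as a `Path2D.Double` exactly as in `createRefPolygon`, and the stereographic projection step is omitted. The class name `HullOverlapSketch` is illustrative, not part of Terrasaur.

```java
import java.awt.geom.Path2D;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class HullOverlapSketch {

  /** Andrew's monotone chain; returns hull vertices in counter-clockwise order. */
  static List<double[]> convexHull(List<double[]> pts) {
    List<double[]> p = new ArrayList<>(pts);
    p.sort(Comparator.<double[]>comparingDouble(a -> a[0]).thenComparingDouble(a -> a[1]));
    int n = p.size();
    double[][] hull = new double[2 * n][];
    int k = 0;
    for (double[] pt : p) { // lower hull
      while (k >= 2 && cross(hull[k - 2], hull[k - 1], pt) <= 0) k--;
      hull[k++] = pt;
    }
    for (int i = n - 2, t = k + 1; i >= 0; i--) { // upper hull
      double[] pt = p.get(i);
      while (k >= t && cross(hull[k - 2], hull[k - 1], pt) <= 0) k--;
      hull[k++] = pt;
    }
    List<double[]> out = new ArrayList<>();
    for (int i = 0; i < k - 1; i++) out.add(hull[i]); // last point repeats the first
    return out;
  }

  /** 2D cross product of (a - o) and (b - o); positive for a left turn. */
  static double cross(double[] o, double[] a, double[] b) {
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0]);
  }

  /** Trace the hull as a closed Path2D, mirroring createRefPolygon above. */
  static Path2D.Double toPolygon(List<double[]> hull) {
    Path2D.Double polygon = new Path2D.Double();
    for (int i = 0; i < hull.size(); i++) {
      double[] v = hull.get(i);
      if (i == 0) polygon.moveTo(v[0], v[1]);
      else polygon.lineTo(v[0], v[1]);
    }
    polygon.closePath();
    return polygon;
  }

  public static void main(String[] args) {
    // reference cloud: unit square corners plus an interior point
    List<double[]> ref = List.of(
        new double[] {0, 0}, new double[] {1, 0}, new double[] {1, 1},
        new double[] {0, 1}, new double[] {0.5, 0.5});
    Path2D.Double polygon = toPolygon(convexHull(ref));
    System.out.println(polygon.contains(0.25, 0.75)); // inside the hull
    System.out.println(polygon.contains(1.5, 0.5)); // outside the hull
  }
}
```

As in `isEnclosed`, any input point whose projection falls inside the traced polygon counts as overlapping.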


@@ -64,348 +64,320 @@ import vtk.vtkPoints;
public class PointCloudToPlane implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
@Override
public String shortDescription() {
return "Find a rotation and translation to transform a point cloud to a height field above the best fit plane.";
}
@Override
public String fullDescription(Options options) {
String header = "";
String footer =
"\nThis program finds a rotation and translation to transform a point cloud to a height field above the best fit plane. "
+ "Supported input formats are ASCII, BINARY, L2, OBJ, and VTK.\n\n"
+ "ASCII format is whitespace-delimited x y z coordinates. BINARY files must contain double-precision x y z coordinates. ";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
private GMTGridUtil gmu;
public GMTGridUtil getGMU() {
return gmu;
}
public void writeOutput(String outputFile) {
if (outputFile != null) {
try (PrintStream ps = new PrintStream(outputFile)) {
double[][] transformation = gmu.getTransformation();
for (int i = 0; i < 4; i++) {
for (int j = 0; j < 4; j++) {
ps.printf("%24.16e ", transformation[i][j]);
}
ps.println();
}
} catch (FileNotFoundException e) {
logger.error(e.getLocalizedMessage(), e);
}
}
}
private PointCloudToPlane() {}
public PointCloudToPlane(vtkPoints points) {
this(points, 0, 0.);
}
public PointCloudToPlane(vtkPoints points, int halfSize, double groundSampleDistance) {
double[] x = new double[(int) points.GetNumberOfPoints()];
double[] y = new double[(int) points.GetNumberOfPoints()];
double[] z = new double[(int) points.GetNumberOfPoints()];
for (int i = 0; i < points.GetNumberOfPoints(); i++) {
double[] thisPoint = points.GetPoint(i);
x[i] = thisPoint[0];
y[i] = thisPoint[1];
z[i] = thisPoint[2];
}
gmu = new GMTGridUtil(halfSize, groundSampleDistance);
gmu.setXYZ(x, y, z);
}
public BufferedImage makePlot(List<Vector3> points, String name) throws SpiceException {
DescriptiveStatistics stats = new DescriptiveStatistics();
VectorStatistics vStats = new VectorStatistics();
DiscreteDataSet data = new DiscreteDataSet(name);
PlotConfig config = ImmutablePlotConfig.builder().title(name).build();
DiscreteDataPlot canvas;
boolean orthographic = false;
if (orthographic) {
for (Vector3 p : points) {
stats.addValue(p.getElt(2));
vStats.add(p);
LatitudinalCoordinates lc = new LatitudinalCoordinates(p);
data.add(lc.getLongitude(), lc.getLatitude(), 0, p.getElt(2));
}
double min = stats.getMin();
double max = stats.getMax();
ColorRamp ramp = ColorRamp.create(TYPE.CBSPECTRAL, min, max).createReverse();
data.setSymbol(new Circle().setSize(1.0));
data.setColorRamp(ramp);
Vector3 center = MathConversions.toVector3(vStats.getMean());
double halfExtent = 0;
for (Vector3 p : points) {
double dist = center.sep(p);
if (dist > halfExtent) halfExtent = dist;
}
ProjectionOrthographic p = new ProjectionOrthographic(
config.width(),
config.height(),
CoordConverters.convertToLatitudinal(
new VectorIJK(center.getElt(0), center.getElt(1), center.getElt(2))));
p.setRadius(Math.max(0.5, .6 / halfExtent));
canvas = new MapPlot(config, p);
canvas.drawAxes();
canvas.plot(data);
((MapPlot) canvas).drawLatLonGrid(Math.toRadians(5), Math.toRadians(5), true);
canvas.drawColorBar(ImmutableColorBar.builder()
.rect(new Rectangle(config.leftMargin(), 40, config.width(), 10))
.ramp(ramp)
.numTicks(5)
.tickFunction(StringFunctions.fixedFormat("%.3f"))
.build());
} else {
for (Vector3 p : points) {
stats.addValue(p.getElt(2));
vStats.add(p);
data.add(p.getElt(0), p.getElt(1), 0, p.getElt(2));
}
double min = stats.getMin();
double max = stats.getMax();
ColorRamp ramp = ColorRamp.create(TYPE.CBSPECTRAL, min, max).createReverse();
data.setSymbol(new Circle().setSize(1.0));
data.setColorRamp(ramp);
canvas = new DiscreteDataPlot(config);
AxisX xAxis = data.defaultXAxis("X");
AxisY yAxis = data.defaultYAxis("Y");
canvas.setAxes(xAxis, yAxis);
canvas.drawAxes();
canvas.plot(data);
canvas.drawColorBar(ImmutableColorBar.builder()
.rect(new Rectangle(config.leftMargin(), 40, config.width(), 10))
.ramp(ramp)
.numTicks(5)
.tickFunction(StringFunctions.fixedFormat("%.3f"))
.build());
}
return canvas.getImage();
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("inputFormat")
.hasArg()
.desc("Format of input file. If not present format is inferred from inputFile extension.")
.build());
options.addOption(Option.builder("inputFile")
.required()
.hasArg()
.desc("Required. Name of input file.")
.build());
options.addOption(Option.builder("inllr")
.desc(
"If present, input values are assumed to be lon, lat, rad. Default is x, y, z. Only used with ASCII or BINARY formats.")
.build());
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(Option.builder("logLevel")
.hasArg()
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim()
+ ". Default is INFO.")
.build());
options.addOption(Option.builder("outputFile")
.hasArg()
.desc(
"Name of output file to contain 4x4 transformation matrix. The top left 3x3 matrix is the rotation matrix. The top "
+ "three entries in the right hand column are the translation vector. The bottom row is always 0 0 0 1.\nTo convert "
+ "from global to local:\n transformed = rotation.mxv(point.sub(translation))")
.build());
options.addOption(Option.builder("translate")
.hasArg()
.desc("Translate surface points and spacecraft position. "
+ "Specify by three floating point numbers separated by commas. "
+ "Default is to use centroid of input point cloud.")
.build());
options.addOption(Option.builder("plotXYZ")
.hasArg()
.desc("Plot X vs Y (in the local frame) colored by Z. " + "Argument is the name of PNG file to write.")
.build());
options.addOption(Option.builder("plotXYR")
.hasArg()
.desc("Plot X vs Y (in the local frame) colored by R. " + "Argument is the name of PNG file to write.")
.build());
options.addOption(Option.builder("slope")
.desc("Choose local coordinate frame such that Z points normal to the plane "
+ "and X points along the direction of steepest descent.")
.build());
return options;
}
public static void main(String[] args) throws SpiceException {
TerrasaurTool defaultOBJ = new PointCloudToPlane();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
NativeLibraryLoader.loadVtkLibraries();
NativeLibraryLoader.loadSpiceLibraries();
String inFile = cl.getOptionValue("inputFile");
boolean inLLR = cl.hasOption("inllr");
FORMATS inFormat = cl.hasOption("inputFormat")
? FORMATS.valueOf(cl.getOptionValue("inputFormat").toUpperCase())
: FORMATS.formatFromExtension(inFile);
PointCloudFormatConverter pcfc = new PointCloudFormatConverter(inFormat, FORMATS.VTK);
pcfc.read(inFile, inLLR);
System.out.printf("%d points read from %s\n", pcfc.getPoints().GetNumberOfPoints(), inFile);
int halfSize = 0;
double groundSampleDistance = 0;
vtkPoints points = pcfc.getPoints();
PointCloudToPlane pctp = new PointCloudToPlane(points, halfSize, groundSampleDistance);
Vector3 translation;
if (cl.hasOption("translate")) {
translation = MathConversions.toVector3(VectorUtils.stringToVector3D(cl.getOptionValue("translate")));
pctp.getGMU().setTranslation(translation.toArray());
}
pctp.getGMU().calculateTransformation();
List<Vector3> globalPts = new ArrayList<>();
for (int i = 0; i < points.GetNumberOfPoints(); i++) globalPts.add(new Vector3(points.GetPoint(i)));
double[][] transformation = pctp.getGMU().getTransformation();
StringBuilder sb = new StringBuilder(String.format(
"translation vector:\n%24.16e%24.16e%24.16e\n",
transformation[0][3], transformation[1][3], transformation[2][3]));
logger.info(sb.toString());
sb = new StringBuilder("rotation matrix:\n");
for (int i = 0; i < 3; i++)
sb.append(String.format(
"%24.16e%24.16e%24.16e\n", transformation[i][0], transformation[i][1], transformation[i][2]));
logger.info(sb.toString());
Matrix33 rotation = new Matrix33(pctp.getGMU().getRotation());
translation = new Vector3(pctp.getGMU().getTranslation());
if (cl.hasOption("slope")) {
Vector3 z = rotation.xpose().mxv(new Vector3(0, 0, 1));
VectorStatistics vStats = new VectorStatistics();
for (Vector3 pt : globalPts) vStats.add(pt);
if (cl.hasOption("slope")) {
Vector3 z = rotation.xpose().mxv(new Vector3(0, 0, 1));
VectorStatistics vStats = new VectorStatistics();
for (Vector3 pt : globalPts) vStats.add(pt);
Vector3 r = MathConversions.toVector3(vStats.getMean());
Vector3 r = MathConversions.toVector3(vStats.getMean());
Vector3 y = r.cross(z).hat();
Vector3 x = y.cross(z).hat();
rotation = new Matrix33(x, y, z);
}
List<Vector3> localPts = new ArrayList<>();
for (Vector3 p : globalPts) localPts.add(rotation.mxv(p.sub(translation)));
VectorStatistics vStats = new VectorStatistics();
for (Vector3 localPt : localPts) vStats.add(localPt);
if (cl.hasOption("plotXYZ")) {
BufferedImage image = pctp.makePlot(localPts, "Z (height above plane)");
PlotCanvas.writeImage(cl.getOptionValue("plotXYZ"), image);
}
if (cl.hasOption("plotXYR")) {
// rotate but don't translate
List<Vector3> xyr = new ArrayList<>();
for (Vector3 p : globalPts) {
Vector3 v = rotation.mxv(p);
xyr.add(new Vector3(v.getElt(0), v.getElt(1), v.norm()));
}
BufferedImage image = pctp.makePlot(xyr, "R");
PlotCanvas.writeImage(cl.getOptionValue("plotXYR"), image);
}
logger.info("statistics on full set");
logger.info(vStats);
Vector3 mean = MathConversions.toVector3(vStats.getMean());
Vector3 std = MathConversions.toVector3(vStats.getStandardDeviation());
double scale = 5;
List<Double> minList = new ArrayList<>();
List<Double> maxList = new ArrayList<>();
for (int i = 0; i < 3; i++) {
minList.add(mean.getElt(i) - scale * std.getElt(i));
maxList.add(mean.getElt(i) + scale * std.getElt(i));
}
vStats = new VectorStatistics();
for (Vector3 v : localPts) {
boolean addThis = true;
for (int i = 0; i < 3; i++) {
if (v.getElt(i) < minList.get(i) || v.getElt(i) > maxList.get(i)) {
addThis = false;
break;
Vector3 y = r.cross(z).hat();
Vector3 x = y.cross(z).hat();
rotation = new Matrix33(x, y, z);
}
}
if (addThis) vStats.add(v);
List<Vector3> localPts = new ArrayList<>();
for (Vector3 p : globalPts) localPts.add(rotation.mxv(p.sub(translation)));
VectorStatistics vStats = new VectorStatistics();
for (Vector3 localPt : localPts) vStats.add(localPt);
if (cl.hasOption("plotXYZ")) {
BufferedImage image = pctp.makePlot(localPts, "Z (height above plane)");
PlotCanvas.writeImage(cl.getOptionValue("plotXYZ"), image);
}
if (cl.hasOption("plotXYR")) {
// rotate but don't translate
List<Vector3> xyr = new ArrayList<>();
for (Vector3 p : globalPts) {
Vector3 v = rotation.mxv(p);
xyr.add(new Vector3(v.getElt(0), v.getElt(1), v.norm()));
}
BufferedImage image = pctp.makePlot(xyr, "R");
PlotCanvas.writeImage(cl.getOptionValue("plotXYR"), image);
}
logger.info("statistics on full set");
logger.info(vStats);
Vector3 mean = MathConversions.toVector3(vStats.getMean());
Vector3 std = MathConversions.toVector3(vStats.getStandardDeviation());
double scale = 5;
List<Double> minList = new ArrayList<>();
List<Double> maxList = new ArrayList<>();
for (int i = 0; i < 3; i++) {
minList.add(mean.getElt(i) - scale * std.getElt(i));
maxList.add(mean.getElt(i) + scale * std.getElt(i));
}
vStats = new VectorStatistics();
for (Vector3 v : localPts) {
boolean addThis = true;
for (int i = 0; i < 3; i++) {
if (v.getElt(i) < minList.get(i) || v.getElt(i) > maxList.get(i)) {
addThis = false;
break;
}
}
if (addThis) vStats.add(v);
}
logger.info("statistics on set without points more than 5 standard deviations from the mean:");
logger.info(vStats);
if (cl.hasOption("outputFile")) pctp.writeOutput(cl.getOptionValue("outputFile"));
}
logger.info("statistics on set without points more than 5 standard deviations from the mean:");
logger.info(vStats);
if (cl.hasOption("outputFile")) pctp.writeOutput(cl.getOptionValue("outputFile"));
}
}
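The 5-standard-deviation outlier rejection applied to `localPts` above can be sketched in isolation. This is a minimal, self-contained version operating on plain doubles rather than the `VectorStatistics` class; the class and method names here are hypothetical, not part of Terrasaur.

```java
import java.util.ArrayList;
import java.util.List;

public class SigmaClipSketch {
    // Keep only values within `scale` standard deviations of the mean,
    // mirroring the rejection loop applied to localPts above.
    static List<Double> clip(List<Double> values, double scale) {
        double mean = values.stream().mapToDouble(Double::doubleValue).average().orElse(0);
        double var = values.stream().mapToDouble(v -> (v - mean) * (v - mean)).average().orElse(0);
        double std = Math.sqrt(var);
        List<Double> kept = new ArrayList<>();
        for (double v : values)
            if (Math.abs(v - mean) <= scale * std) kept.add(v);
        return kept;
    }

    public static void main(String[] args) {
        List<Double> values = new ArrayList<>();
        for (int i = 0; i < 30; i++) values.add(1.0);
        values.add(1000.0); // gross outlier
        System.out.println(clip(values, 5.0).size()); // prints 30
    }
}
```

Note that with very small samples a single outlier cannot exceed 5 standard deviations (the maximum z-score is bounded by the sample size), so a threshold this wide only trims truly gross outliers from reasonably large point clouds.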


@@ -24,7 +24,6 @@ package terrasaur.apps;
import java.util.ArrayList;
import java.util.Map;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
@@ -45,50 +44,53 @@ import vtk.vtkPolyData;
*/
public class PrintShapeModelStatistics implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
private PrintShapeModelStatistics() {}
@Override
public String shortDescription() {
return "Print statistics about a shape model.";
}
@Override
public String fullDescription(Options options) {
String header = "This program prints various statistics about a shape model in OBJ format.";
String footer = "";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("objFile")
.required()
.hasArg()
.desc("Path to OBJ file.")
.build());
return options;
}
public static void main(String[] args) throws Exception {
TerrasaurTool defaultOBJ = new PrintShapeModelStatistics();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
String filename = cl.getOptionValue("objFile");
NativeLibraryLoader.loadVtkLibraries();
vtkPolyData polydata = PolyDataUtil.loadShapeModelAndComputeNormals(filename);
PolyDataStatistics stat = new PolyDataStatistics(polydata);
ArrayList<String> stats = stat.getShapeModelStats();
for (String line : stats) {
logger.info(line);
}
}
}


@@ -46,384 +46,375 @@ import vtk.vtkIdList;
import vtk.vtkPolyData;
public class RangeFromSumFile implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
@Override
public String shortDescription() {
return "Calculate range to surface from a sumfile.";
}
@Override
public String fullDescription(Options options) {
String header = "";
String footer =
"""
This program reads a sumfile along with a shape model and \
calculates the range to the surface. NOTE: Spacecraft position is \
assumed to be in kilometers. If not, use the -distanceScale option \
to convert to km.
""";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
private SumFile sumFile;
private vtkPolyData polyData;
private SmallBodyModel smallBodyModel;
private int xOffset;
private int yOffset;
private long facet;
private Vector3D scPos;
private Vector3D sunXYZ;
private Vector3D surfaceIntercept;
private double tiltDeg;
private double tiltDir;
private double incidence;
private double emission;
private double phase;
private double scAzimuth;
private double scElevation;
private double sunAzimuth;
private double sunElevation;
private DescriptiveStatistics stats;
private double centerX, centerY;
public DescriptiveStatistics getStats() {
return stats;
}
private RangeFromSumFile() {}
public RangeFromSumFile(SumFile sumFile, vtkPolyData polyData) {
this.sumFile = sumFile;
int nPixelsX = sumFile.imageWidth();
int nPixelsY = sumFile.imageHeight();
centerX = 0.5 * (nPixelsX - 1);
centerY = 0.5 * (nPixelsY - 1);
this.polyData = polyData;
smallBodyModel = new SmallBodyModel(polyData);
scPos = sumFile.scobj().negate();
sunXYZ = sumFile.sunDirection();
stats = new DescriptiveStatistics();
}
public void setDistanceScale(double distanceScale) {
this.scPos = sumFile.scobj().scalarMultiply(distanceScale).negate();
}
/**
* @param xOffset x offset in pixels
* @param yOffset y offset in pixels
* @return key is cell index, value is surface intercept for the desired pixel offset from the center of the image.
*/
public Map.Entry<Long, Vector3D> findIntercept(int xOffset, int yOffset) {
this.xOffset = xOffset;
this.yOffset = yOffset;
Vector3D lookDir = new Vector3D(1.0, sumFile.boresight());
if (xOffset != 0) {
Vector3D offset = new Vector3D(-xOffset, sumFile.xPerPixel());
lookDir = lookDir.add(offset);
}
if (yOffset != 0) {
Vector3D offset = new Vector3D(-yOffset, sumFile.yPerPixel());
lookDir = lookDir.add(offset);
}
double[] tmp = new double[3];
facet = smallBodyModel.computeRayIntersection(scPos.toArray(), lookDir.toArray(), tmp);
if (facet == -1) {
surfaceIntercept = null;
} else {
surfaceIntercept = new Vector3D(tmp);
vtkIdList idList = new vtkIdList();
double[] pt0 = new double[3];
double[] pt1 = new double[3];
double[] pt2 = new double[3];
polyData.GetCellPoints(facet, idList);
// get the ids for each point
long id0 = idList.GetId(0);
long id1 = idList.GetId(1);
long id2 = idList.GetId(2);
// get points that comprise the cell
polyData.GetPoint(id0, pt0);
polyData.GetPoint(id1, pt1);
polyData.GetPoint(id2, pt2);
TriangularFacet facet = new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
Vector3 center3 = MathConversions.toVector3(facet.getCenter());
Vector3D center3D = MathConversions.toVector3D(facet.getCenter());
Vector3 normal3 = MathConversions.toVector3(facet.getNormal());
Vector3D normal3D = MathConversions.toVector3D(facet.getNormal());
tiltDeg = Math.toDegrees(center3.sep(normal3));
if (tiltDeg > 90) tiltDeg = 180 - tiltDeg;
tiltDir = Tilts.basicTiltDirDeg(surfaceIntercept.getAlpha(), normal3D);
incidence = Vector3D.angle(sunXYZ, normal3D);
emission = Vector3D.angle(scPos, normal3D);
phase = Vector3D.angle(sunXYZ, scPos.subtract(center3D));
try {
// scPos is in body fixed coordinates
Plane p = new Plane(normal3, center3);
Vector3 projectedNorth =
p.project(new Vector3(0, 0, 1).add(center3)).sub(center3);
Vector3 projected = p.project(MathConversions.toVector3(scPos)).sub(center3);
scAzimuth = projected.sep(projectedNorth);
if (projected.cross(projectedNorth).dot(center3) < 0) scAzimuth = 2 * Math.PI - scAzimuth;
scElevation = Math.PI / 2 - emission;
// sunXYZ is a unit vector pointing to the sun
projected = p.project(MathConversions.toVector3(sunXYZ).add(center3))
.sub(center3);
sunAzimuth = projected.sep(projectedNorth);
if (projected.cross(projectedNorth).dot(center3) < 0) sunAzimuth = 2 * Math.PI - sunAzimuth;
sunElevation = Math.PI / 2 - incidence;
} catch (SpiceException e) {
logger.error(e.getLocalizedMessage(), e);
}
stats.addValue(scPos.distance(surfaceIntercept));
}
return new AbstractMap.SimpleEntry<>(facet, surfaceIntercept);
}
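The azimuth values computed in `findIntercept` come from projecting both the direction of interest and the body's north pole direction into the facet's tangent plane, then measuring the angle between the two projections. A minimal self-contained sketch with plain arrays (class and method names are hypothetical; the facet normal is assumed to be unit length):

```java
public class AzimuthSketch {
    static double dot(double[] a, double[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    static double norm(double[] a) {
        return Math.sqrt(dot(a, a));
    }

    // Remove the component of v along the unit normal n, i.e. project v
    // into the facet's tangent plane.
    static double[] project(double[] v, double[] n) {
        double d = dot(v, n);
        return new double[] {v[0] - d * n[0], v[1] - d * n[1], v[2] - d * n[2]};
    }

    public static void main(String[] args) {
        double[] n = {0, 0, 1};       // facet normal (assumed unit length)
        double[] north = {0, 1, 0.5}; // toward the +Z pole, not in the plane
        double[] toSun = {1, 0, 0.3}; // direction of interest
        double[] pn = project(north, n);
        double[] ps = project(toSun, n);
        double az = Math.acos(dot(pn, ps) / (norm(pn) * norm(ps)));
        System.out.printf("%.1f%n", Math.toDegrees(az)); // prints 90.0
    }
}
```

The real code additionally uses a cross product against the facet center to resolve whether the azimuth falls in [0, pi) or [pi, 2*pi), which an `acos` alone cannot distinguish.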
public String getHeader(String filename) {
StringBuffer sb = new StringBuffer();
sb.append("# x increases to the right and y increases down. Top left corner is 0, 0.\n");
sb.append(String.format("# %s\n", filename));
sb.append(String.format("%7s", "# x"));
sb.append(String.format("%7s", "y"));
sb.append(StringUtils.center("facet", 8));
sb.append(StringUtils.center("Tilt", 12));
sb.append(StringUtils.center("Tilt Dir", 12));
sb.append(StringUtils.center("s/c position XYZ", 36));
sb.append(StringUtils.center("surface intercept XYZ", 36));
sb.append(StringUtils.center("lon", 12));
sb.append(StringUtils.center("lat", 12));
sb.append(StringUtils.center("rad", 12));
sb.append(StringUtils.center("range", 12));
sb.append(StringUtils.center("inc", 12));
sb.append(StringUtils.center("ems", 12));
sb.append(StringUtils.center("phase", 12));
sb.append(StringUtils.center("s/c az", 12));
sb.append(StringUtils.center("s/c el", 12));
sb.append(StringUtils.center("sun az", 12));
sb.append(StringUtils.center("sun el", 12));
sb.append("\n");
sb.append(String.format("%7s", "# "));
sb.append(String.format("%7s", ""));
sb.append(String.format("%8s", ""));
sb.append(StringUtils.center("(deg)", 12));
sb.append(StringUtils.center("(deg)", 12));
sb.append(StringUtils.center("(km)", 36));
sb.append(StringUtils.center("(km)", 36));
sb.append(StringUtils.center("(deg)", 12));
sb.append(StringUtils.center("(deg)", 12));
sb.append(StringUtils.center("(km)", 12));
sb.append(StringUtils.center("(km)", 12));
sb.append(StringUtils.center("(deg)", 12));
sb.append(StringUtils.center("(deg)", 12));
sb.append(StringUtils.center("(deg)", 12));
sb.append(StringUtils.center("(deg)", 12));
sb.append(StringUtils.center("(deg)", 12));
sb.append(StringUtils.center("(deg)", 12));
sb.append(StringUtils.center("(deg)", 12));
return sb.toString();
}
@Override
public String toString() {
StringBuilder sb = new StringBuilder();
sb.append(String.format("%7.2f", xOffset + centerX));
sb.append(String.format("%7.2f", yOffset + centerY));
sb.append(String.format("%8d", facet));
sb.append(String.format("%12.6f", tiltDeg));
sb.append(String.format("%12.6f", tiltDir));
sb.append(String.format("%12.6f", scPos.getX()));
sb.append(String.format("%12.6f", scPos.getY()));
sb.append(String.format("%12.6f", scPos.getZ()));
sb.append(String.format("%12.6f", surfaceIntercept.getX()));
sb.append(String.format("%12.6f", surfaceIntercept.getY()));
sb.append(String.format("%12.6f", surfaceIntercept.getZ()));
double lon = Math.toDegrees(surfaceIntercept.getAlpha());
if (lon < 0) lon += 360;
sb.append(String.format("%12.6f", lon));
sb.append(String.format("%12.6f", Math.toDegrees(surfaceIntercept.getDelta())));
sb.append(String.format("%12.6f", surfaceIntercept.getNorm()));
sb.append(String.format("%12.6f", scPos.distance(surfaceIntercept)));
sb.append(String.format("%12.6f", Math.toDegrees(incidence)));
sb.append(String.format("%12.6f", Math.toDegrees(emission)));
sb.append(String.format("%12.6f", Math.toDegrees(phase)));
sb.append(String.format("%12.6f", Math.toDegrees(scAzimuth)));
sb.append(String.format("%12.6f", Math.toDegrees(scElevation)));
sb.append(String.format("%12.6f", Math.toDegrees(sunAzimuth)));
sb.append(String.format("%12.6f", Math.toDegrees(sunElevation)));
return sb.toString();
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("sumFile")
.required()
.hasArg()
.desc("Required. Name of sum file to read.")
.build());
options.addOption(Option.builder("objFile")
.required()
.hasArg()
.desc("Required. Name of OBJ shape file.")
.build());
options.addOption(Option.builder("pixelOffset")
.hasArg()
.desc(
"Pixel offset from center of image, given as a comma separated pair (no spaces). Default is 0,0. "
+ "x increases to the right and y increases down.")
.build());
options.addOption(Option.builder("xRange")
.hasArg()
.desc(
"Range of X pixel offsets from center of image, given as a comma separated triplet (xStart, xStop, xSpacing with no spaces). "
+ "For example -50,50,5.")
.build());
options.addOption(Option.builder("yRange")
.hasArg()
.desc(
"Range of Y pixel offsets from center of image, given as a comma separated triplet (yStart, yStop, ySpacing with no spaces). "
+ "For example -50,50,5.")
.build());
options.addOption(Option.builder("radius")
.hasArg()
.desc(
"Evaluate all pixels within specified distance (in pixels) of desired pixel. This value will be rounded to the nearest integer.")
.build());
options.addOption(Option.builder("distanceScale")
.hasArg()
.desc(
"Spacecraft position is assumed to be in kilometers. If not, scale by this value (e.g. Use 0.001 if s/c pos is in meters).")
.build());
options.addOption(Option.builder("stats")
.desc("Print out statistics about range to all selected pixels.")
.build());
return options;
}
public static void main(String[] args) throws Exception {
TerrasaurTool defaultOBJ = new RangeFromSumFile();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
NativeLibraryLoader.loadSpiceLibraries();
NativeLibraryLoader.loadVtkLibraries();
SumFile sumFile = SumFile.fromFile(new File(cl.getOptionValue("sumFile")));
int xStart = 0;
int xStop = 1;
int xSpacing = 1;
int yStart = 0;
int yStop = 1;
int ySpacing = 1;
if (cl.hasOption("pixelOffset")) {
String[] parts = cl.getOptionValue("pixelOffset").split(",");
int x = Integer.parseInt(parts[0].trim());
int y = Integer.parseInt(parts[1].trim());
xStart = x;
xStop = x + 1;
yStart = y;
yStop = y + 1;
}
if (cl.hasOption("xRange")) {
String[] parts = cl.getOptionValue("xRange").split(",");
xStart = Integer.parseInt(parts[0].trim());
xStop = Integer.parseInt(parts[1].trim());
xSpacing = Integer.parseInt(parts[2].trim());
}
if (cl.hasOption("yRange")) {
String[] parts = cl.getOptionValue("yRange").split(",");
yStart = Integer.parseInt(parts[0].trim());
yStop = Integer.parseInt(parts[1].trim());
ySpacing = Integer.parseInt(parts[2].trim());
}
int checkRadius = 0;
if (cl.hasOption("radius")) {
checkRadius = (int) Math.round(Double.parseDouble(cl.getOptionValue("radius")));
xStart -= checkRadius;
xStop += checkRadius;
yStart -= checkRadius;
yStop += checkRadius;
}
String objFile = cl.getOptionValue("objFile");
vtkPolyData polyData = PolyDataUtil.loadShapeModel(objFile);
RangeFromSumFile rfsf = new RangeFromSumFile(sumFile, polyData);
if (cl.hasOption("distanceScale"))
rfsf.setDistanceScale(Double.parseDouble(cl.getOptionValue("distanceScale")));
System.out.println(rfsf.getHeader(cl.getOptionValue("sumFile")));
for (int ix = xStart; ix < xStop; ix += xSpacing) {
for (int iy = yStart; iy < yStop; iy += ySpacing) {
if (checkRadius > 0) {
double midx = (xStart + xStop) / 2.;
double midy = (yStart + yStop) / 2.;
if ((ix - midx) * (ix - midx) + (iy - midy) * (iy - midy) > checkRadius * checkRadius) continue;
}
long cellID = rfsf.findIntercept(ix, iy).getKey();
if (cellID > -1) System.out.println(rfsf);
}
}
if (cl.hasOption("stats")) System.out.println("Range " + rfsf.getStats());
}
}
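The per-pixel look direction built in `findIntercept` above offsets the boresight by `-xOffset` times the x-per-pixel vector and `-yOffset` times the y-per-pixel vector. A self-contained sketch with made-up basis vectors (the values are illustrative only, not from a real sumfile):

```java
public class LookDirSketch {
    // Return a + s * b, the scaled vector addition used to accumulate
    // the per-pixel offsets onto the boresight.
    static double[] add(double[] a, double[] b, double s) {
        return new double[] {a[0] + s * b[0], a[1] + s * b[1], a[2] + s * b[2]};
    }

    public static void main(String[] args) {
        double[] boresight = {0, 0, 1};      // camera look direction at image center
        double[] xPerPixel = {1e-4, 0, 0};   // hypothetical per-pixel x basis
        double[] yPerPixel = {0, 1e-4, 0};   // hypothetical per-pixel y basis
        int xOffset = 10, yOffset = -5;
        // lookDir = boresight - xOffset * xPerPixel - yOffset * yPerPixel
        double[] lookDir = add(add(boresight, xPerPixel, -xOffset), yPerPixel, -yOffset);
        System.out.printf("%.4f %.4f %.4f%n", lookDir[0], lookDir[1], lookDir[2]);
    }
}
```

The resulting vector is then cast as a ray from the spacecraft position into the shape model via `computeRayIntersection` to find the surface intercept for that pixel.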

File diff suppressed because it is too large


@@ -42,201 +42,228 @@ import terrasaur.templates.TerrasaurTool;
public class RotationConversion implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
@Override
public String shortDescription() {
return "Convert rotations between different types.";
}
@Override
public String fullDescription(Options options) {
String header = "";
String footer =
"""
This program converts rotations between angle and axis, 3x3 matrix, quaternions, \
and ZXZ rotation Euler angles. Note that the rotation modifies the frame; \
the vector is considered to be fixed. To find the rotation that modifies the \
vector in a fixed frame, take the transpose of this matrix.
""";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
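The quaternion convention described by the `q0`..`q3` option descriptions below (scalar term `cos(theta/2)`, vector terms `sin(theta/2) * V[i]`) can be illustrated with a short sketch; the class and method names are hypothetical, not part of Terrasaur:

```java
public class QuaternionSketch {
    // Build a quaternion {q0, q1, q2, q3} from a rotation angle in radians
    // and a unit rotation axis, following the convention in the option
    // descriptions: q0 = cos(theta/2), qi = sin(theta/2) * V[i].
    static double[] fromAngleAxis(double angle, double[] axis) {
        double c = Math.cos(angle / 2), s = Math.sin(angle / 2);
        return new double[] {c, s * axis[0], s * axis[1], s * axis[2]};
    }

    public static void main(String[] args) {
        // 90-degree rotation about the Z axis
        double[] q = fromAngleAxis(Math.PI / 2, new double[] {0, 0, 1});
        System.out.printf("%.4f %.4f %.4f %.4f%n", q[0], q[1], q[2], q[3]);
        // prints 0.7071 0.0000 0.0000 0.7071
    }
}
```

For a unit axis this always yields a unit quaternion, since q0^2 + q1^2 + q2^2 + q3^2 = cos^2(theta/2) + sin^2(theta/2) = 1.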
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(Option.builder("logLevel")
.hasArg()
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim() + ". Default is INFO.")
.build());
options.addOption(Option.builder("angle")
.hasArg()
.desc("Rotation angle, in radians.")
.build());
options.addOption(Option.builder("axis0")
.hasArg()
.desc("First element of rotation axis.")
.build());
options.addOption(Option.builder("axis1")
.hasArg()
.desc("Second element of rotation axis.")
.build());
options.addOption(Option.builder("axis2")
.hasArg()
.desc("Third element of rotation axis.")
.build());
options.addOption(Option.builder("cardanXYZ1")
.hasArg()
.desc("Cardan angle for the first rotation (about the X axis) in radians.")
.build());
options.addOption(Option.builder("cardanXYZ2")
.hasArg()
.desc("Cardan angle for the second rotation (about the Y axis) in radians.")
.build());
options.addOption(Option.builder("cardanXYZ3")
.hasArg()
.desc("Cardan angle for the third rotation (about the Z axis) in radians.")
.build());
options.addOption(Option.builder("eulerZXZ1")
.hasArg()
.desc("Euler angle for the first rotation (about the Z axis) in radians.")
.build());
options.addOption(Option.builder("eulerZXZ2")
.hasArg()
.desc("Euler angle for the second rotation (about the rotated X axis) in radians.")
.build());
options.addOption(Option.builder("eulerZXZ3")
.hasArg()
.desc("Euler angle for the third rotation (about the rotated Z axis) in radians.")
.build());
options.addOption(Option.builder("q0")
.hasArg()
.desc("Scalar term for quaternion: cos(theta/2)")
.build());
options.addOption(Option.builder("q1")
.hasArg()
.desc("First vector term for quaternion: sin(theta/2) * V[0]")
.build());
options.addOption(Option.builder("q2")
.hasArg()
.desc("Second vector term for quaternion: sin(theta/2) * V[1]")
.build());
options.addOption(Option.builder("q3")
.hasArg()
.desc("Third vector term for quaternion: sin(theta/2) * V[2]")
.build());
options.addOption(Option.builder("matrix")
.hasArg()
.desc("name of file containing rotation matrix to convert to Euler angles. "
+ "Format is 3x3 array in plain text separated by white space.")
.build());
options.addOption(Option.builder("anglesInDegrees")
.desc("If present, input angles in degrees and print output angles in degrees. Default is false.")
.build());
return options;
}
public static void main(String[] args) throws Exception {
RotationConversion defaultOBJ = new RotationConversion();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
boolean inDegrees = cl.hasOption("anglesInDegrees");
boolean axisAndAngle =
cl.hasOption("angle") && cl.hasOption("axis0") && cl.hasOption("axis1") && cl.hasOption("axis2");
boolean cardanXYZ = cl.hasOption("cardanXYZ1") && cl.hasOption("cardanXYZ2") && cl.hasOption("cardanXYZ3");
boolean eulerZXZ = cl.hasOption("eulerZXZ1") && cl.hasOption("eulerZXZ2") && cl.hasOption("eulerZXZ3");
boolean quaternion = cl.hasOption("q0") && cl.hasOption("q1") && cl.hasOption("q2") && cl.hasOption("q3");
boolean matrix = cl.hasOption("matrix");
if (!(axisAndAngle || cardanXYZ || eulerZXZ || quaternion || matrix)) {
logger.warn(
"Must specify input rotation as axis and angle, Cardan or Euler angles, matrix, or quaternion.");
System.exit(0);
}
Rotation r = null;
if (matrix) {
List<String> lines = FileUtils.readLines(new File(cl.getOptionValue("matrix")), Charset.defaultCharset());
double[][] m = new double[3][3];
for (int i = 0; i < 3; i++) {
String[] parts = lines.get(i).trim().split("\\s+");
for (int j = 0; j < 3; j++) m[i][j] = Double.parseDouble(parts[j].trim());
}
r = new Rotation(m, 1e-10);
}
if (axisAndAngle) {
double angle = Double.parseDouble(cl.getOptionValue("angle").trim());
if (inDegrees) angle = Math.toRadians(angle);
r = new Rotation(
new Vector3D(
Double.parseDouble(cl.getOptionValue("axis0").trim()),
Double.parseDouble(cl.getOptionValue("axis1").trim()),
Double.parseDouble(cl.getOptionValue("axis2").trim())),
angle,
RotationConvention.FRAME_TRANSFORM);
}
if (cardanXYZ) {
double angle1 = Double.parseDouble(cl.getOptionValue("cardanXYZ1").trim());
double angle2 = Double.parseDouble(cl.getOptionValue("cardanXYZ2").trim());
double angle3 = Double.parseDouble(cl.getOptionValue("cardanXYZ3").trim());
if (inDegrees) {
angle1 = Math.toRadians(angle1);
angle2 = Math.toRadians(angle2);
angle3 = Math.toRadians(angle3);
}
r = new Rotation(RotationOrder.XYZ, RotationConvention.FRAME_TRANSFORM, angle1, angle2, angle3);
}
if (eulerZXZ) {
double angle1 = Double.parseDouble(cl.getOptionValue("eulerZXZ1").trim());
double angle2 = Double.parseDouble(cl.getOptionValue("eulerZXZ2").trim());
double angle3 = Double.parseDouble(cl.getOptionValue("eulerZXZ3").trim());
if (inDegrees) {
angle1 = Math.toRadians(angle1);
angle2 = Math.toRadians(angle2);
angle3 = Math.toRadians(angle3);
}
r = new Rotation(RotationOrder.ZXZ, RotationConvention.FRAME_TRANSFORM, angle1, angle2, angle3);
}
if (quaternion) {
r = new Rotation(
Double.parseDouble(cl.getOptionValue("q0").trim()),
Double.parseDouble(cl.getOptionValue("q1").trim()),
Double.parseDouble(cl.getOptionValue("q2").trim()),
Double.parseDouble(cl.getOptionValue("q3").trim()),
true);
}
double[][] m = r.getMatrix();
String matrixString = String.format(
"rotation matrix:\n%24.16e %24.16e %24.16e\n%24.16e %24.16e %24.16e\n%24.16e %24.16e %24.16e",
m[0][0], m[0][1], m[0][2], m[1][0], m[1][1], m[1][2], m[2][0], m[2][1], m[2][2]);
System.out.println(matrixString);
String axisAndAngleString = inDegrees
? String.format(
"angle (degrees), axis:\n%g, %s",
Math.toDegrees(r.getAngle()), r.getAxis(RotationConvention.FRAME_TRANSFORM))
: String.format(
"angle (radians), axis:\n%g, %s", r.getAngle(), r.getAxis(RotationConvention.FRAME_TRANSFORM));
System.out.println(axisAndAngleString);
try {
double[] angles = r.getAngles(RotationOrder.XYZ, RotationConvention.FRAME_TRANSFORM);
String cardanString = inDegrees
? String.format(
"Cardan XYZ angles (degrees):\n%g, %g, %g",
Math.toDegrees(angles[0]), Math.toDegrees(angles[1]), Math.toDegrees(angles[2]))
: String.format("Cardan XYZ angles (radians):\n%g, %g, %g", angles[0], angles[1], angles[2]);
System.out.println(cardanString);
} catch (CardanEulerSingularityException e) {
System.out.println("Cardan angles: encountered singularity, cannot solve");
}
try {
double[] angles = r.getAngles(RotationOrder.ZXZ, RotationConvention.FRAME_TRANSFORM);
String eulerString = inDegrees
? String.format(
"Euler ZXZ angles (degrees):\n%g, %g, %g",
Math.toDegrees(angles[0]), Math.toDegrees(angles[1]), Math.toDegrees(angles[2]))
: String.format("Euler ZXZ angles (radians):\n%g, %g, %g", angles[0], angles[1], angles[2]);
System.out.println(eulerString);
} catch (CardanEulerSingularityException e) {
System.out.println("Euler angles: encountered singularity, cannot solve");
}
System.out.printf("Quaternion:\n%g, %g, %g, %g\n", r.getQ0(), r.getQ1(), r.getQ2(), r.getQ3());
}
}
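The frame-transform convention noted in fullDescription above (the rotation matrix modifies the frame; its transpose rotates the vector in a fixed frame) can be sketched with a small stdlib-only example. The `rotZ`, `mxv`, and `transpose` helpers below are hypothetical illustrations for this note, not Terrasaur code:

```java
// Sketch: a FRAME_TRANSFORM rotation matrix M gives a fixed vector's
// coordinates in the rotated frame; M transposed instead rotates the
// vector itself within a fixed frame.
public class FrameVsVector {

    // Rotation about +Z by t radians, frame-transform convention.
    static double[][] rotZ(double t) {
        return new double[][] {
            { Math.cos(t), Math.sin(t), 0},
            {-Math.sin(t), Math.cos(t), 0},
            { 0,           0,           1}
        };
    }

    // Matrix times vector.
    static double[] mxv(double[][] m, double[] v) {
        double[] r = new double[3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++) r[i] += m[i][j] * v[j];
        return r;
    }

    static double[][] transpose(double[][] m) {
        double[][] t = new double[3][3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++) t[i][j] = m[j][i];
        return t;
    }

    public static void main(String[] args) {
        double[][] m = rotZ(Math.PI / 2); // frame rotated 90 degrees about Z
        double[] x = {1, 0, 0};
        // X axis expressed in the rotated frame: Y component is -1
        double[] inFrame = mxv(m, x);
        // The transpose rotates the vector instead: X -> Y, Y component is +1
        double[] rotated = mxv(transpose(m), x);
        System.out.println(Math.round(inFrame[1]) + " " + Math.round(rotated[1]));
    }
}
```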


@@ -44,7 +44,7 @@ import terrasaur.utils.math.MathConversions;
public class SPKFromSumFile implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
@Override
public String shortDescription() {
@@ -54,7 +54,8 @@ public class SPKFromSumFile implements TerrasaurTool {
@Override
public String fullDescription(Options options) {
String header = "";
String footer =
"""
Given three or more sumfiles, fit a parabola to the spacecraft
trajectory in J2000 and create an input file for MKSPK.
""";
@@ -70,10 +71,11 @@ public class SPKFromSumFile implements TerrasaurTool {
private NavigableMap<Double, String> sumFilenames;
private UnwritableInterval interval;
private SPKFromSumFile() {}
private SPKFromSumFile(
Body observer, Body target, ReferenceFrame bodyFixed, Map<String, Double> weightMap, double extend)
throws SpiceException {
this.observer = observer;
this.target = target;
this.bodyFixed = bodyFixed;
@@ -102,8 +104,9 @@ public class SPKFromSumFile implements TerrasaurTool {
* @param velocityIsJ2000 if true, user-supplied velocity is in J2000 frame
* @return command to run MKSPK
*/
public String writeMKSPKFiles(
String basename, List<String> comments, int degree, final Vector3 velocity, boolean velocityIsJ2000)
throws SpiceException {
String commentFile = basename + "-comments.txt";
String setupFile = basename + ".setup";
@@ -112,23 +115,27 @@ public class SPKFromSumFile implements TerrasaurTool {
try (PrintWriter pw = new PrintWriter(commentFile)) {
StringBuilder sb = new StringBuilder();
if (!comments.isEmpty()) {
for (String comment : comments) sb.append(comment).append("\n");
sb.append("\n");
}
sb.append(String.format(
"This SPK for %s was generated by fitting a parabola to each component of the "
+ "SCOBJ vector from " + "the following sumfiles:\n",
target));
for (String sumFile : sumFilenames.values()) {
sb.append(String.format("\t%s\n", sumFile));
}
sb.append("The SCOBJ vector was transformed to J2000 and an aberration correction ");
sb.append(String.format(
"was applied to find the geometric position relative to %s before the parabola " + "fit. ",
target.getName()));
sb.append(String.format(
"The period covered by this SPK is %s to %s.",
new TDBTime(interval.getBegin()).toUTCString("ISOC", 3),
new TDBTime(interval.getEnd()).toUTCString("ISOC", 3)));
String allComments = sb.toString();
for (String comment : allComments.split("\\r?\\n")) pw.println(WordUtils.wrap(comment, 80));
} catch (FileNotFoundException e) {
logger.error(e.getLocalizedMessage(), e);
}
@@ -196,23 +203,37 @@ public class SPKFromSumFile implements TerrasaurTool {
PolynomialFunction zPos = new PolynomialFunction(zCoeff);
PolynomialFunction zVel = zPos.polynomialDerivative();
logger.info(
"Polynomial fitting coefficients for geometric position of {} relative to {} in J2000:",
observer.getName(),
target.getName());
StringBuilder xMsg = new StringBuilder(String.format("X = %e ", xCoeff[0]));
StringBuilder yMsg = new StringBuilder(String.format("Y = %e ", yCoeff[0]));
StringBuilder zMsg = new StringBuilder(String.format("Z = %e ", zCoeff[0]));
for (int i = 1; i <= degree; i++) {
xMsg.append(xCoeff[i] < 0 ? "- " : "+ ")
.append(String.format("%e ", Math.abs(xCoeff[i])))
.append("t")
.append(i > 1 ? "^" + i : "")
.append(" ");
yMsg.append(yCoeff[i] < 0 ? "- " : "+ ")
.append(String.format("%e ", Math.abs(yCoeff[i])))
.append("t")
.append(i > 1 ? "^" + i : "")
.append(" ");
zMsg.append(zCoeff[i] < 0 ? "- " : "+ ")
.append(String.format("%e ", Math.abs(zCoeff[i])))
.append("t")
.append(i > 1 ? "^" + i : "")
.append(" ");
}
logger.info(xMsg);
logger.info(yMsg);
logger.info(zMsg);
logger.debug("");
logger.debug("NOTE: comparing aberration correction=LT+S positions from sumfile with aberration "
+ "correction=NONE for fit.");
for (Double t : sumFiles.keySet()) {
TDBTime tdb = new TDBTime(t);
SumFile sumFile = sumFiles.get(t);
@@ -234,32 +255,34 @@ public class SPKFromSumFile implements TerrasaurTool {
double vx = -xVel.value(t);
double vy = -yVel.value(t);
double vz = -zVel.value(t);
pw.printf(
"%.16e %.16e %.16e %.16e %.16e %.16e %.16e\n",
t, -xPos.value(t), -yPos.value(t), -zPos.value(t), vx, vy, vz);
} else {
Vector3 thisVelocity = new Vector3(velocity);
if (!velocityIsJ2000) {
TDBTime tdb = new TDBTime(t);
thisVelocity =
bodyFixed.getPositionTransformation(J2000, tdb).mxv(velocity);
}
double vx = thisVelocity.getElt(0);
double vy = thisVelocity.getElt(1);
double vz = thisVelocity.getElt(2);
pw.printf(
"%.16e %.16e %.16e %.16e %.16e %.16e %.16e\n",
t, -xPos.value(t), -yPos.value(t), -zPos.value(t), vx, vy, vz);
}
}
} catch (FileNotFoundException e) {
logger.error(e.getLocalizedMessage(), e);
}
try (PrintWriter pw = new PrintWriter(basename + ".csv")) {
pw.println("# Note: fit quantities are without light time or aberration corrections");
pw.println("# SCOBJ");
pw.println("# UTC, TDB, SumFile, SPICE (body fixed) x, y, z, SCOBJ (body fixed) x, y, z, SCOBJ (J2000) x,"
+ " y, z, SCOBJ (Geometric J2000) x, y, z, Fit SCOBJ (body fixed) x, y, z, Fit SCOBJ (Geometric "
+ "J2000) x, y, z");
for (Double t : sumFiles.keySet()) {
SumFile sumFile = sumFiles.get(t);
pw.printf("%s,", sumFile.utcString());
@@ -300,8 +323,8 @@ public class SPKFromSumFile implements TerrasaurTool {
pw.println();
}
pw.println("\n# Velocity");
pw.println("# UTC, TDB, SumFile, SPICE (body fixed) x, y, z, SPICE (J2000) x, y, z, Fit (body fixed) x, "
+ "y, z, Fit (J2000) x, y, z");
for (Double t : sumFiles.keySet()) {
SumFile sumFile = sumFiles.get(t);
pw.printf("%s,", sumFile.utcString());
@@ -327,7 +350,8 @@ public class SPKFromSumFile implements TerrasaurTool {
if (velocity != null) {
Vector3 thisVelocity = new Vector3(velocity);
if (!velocityIsJ2000) {
thisVelocity =
bodyFixed.getPositionTransformation(J2000, tdb).mxv(velocity);
}
double vx = thisVelocity.getElt(0);
double vy = thisVelocity.getElt(1);
@@ -335,8 +359,8 @@ public class SPKFromSumFile implements TerrasaurTool {
velJ2000 = new Vector3(vx, vy, vz);
}
StateVector stateJ2000 =
new StateVector(new Vector3(xPos.value(t), yPos.value(t), zPos.value(t)), velJ2000);
velBodyFixed = j2000ToBodyFixed.mxv(stateJ2000).getVector3(1);
pw.printf("%s, ", velBodyFixed.getElt(0));
@@ -357,19 +381,39 @@ public class SPKFromSumFile implements TerrasaurTool {
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("degree")
.hasArg()
.desc("Degree of polynomial used to fit sumFile locations" + "." + " Default is 2.")
.build());
options.addOption(Option.builder("extend")
.hasArg()
.desc("Extend SPK past the last sumFile by <arg> seconds. " + " Default is zero.")
.build());
options.addOption(Option.builder("frame")
.hasArg()
.desc("Name of body fixed frame. This will default to the " + "target's body fixed frame.")
.build());
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(Option.builder("logLevel")
.hasArg()
.desc("If present, print messages above selected " + "priority. Valid values are "
+ sb.toString().trim() + ". Default is INFO.")
.build());
options.addOption(Option.builder("observer")
.required()
.hasArg()
.desc("Required. SPICE ID for the observer.")
.build());
options.addOption(Option.builder("sumFile")
.hasArg()
.required()
.desc(
"""
File listing sumfiles to read. This is a text file,
one per line. You can include an optional weight
after each filename. The default weight is 1.0.
@@ -385,14 +429,31 @@ public class SPKFromSumFile implements TerrasaurTool {
D717506131G0.SUM
# Weight this last image less than the others
D717506132G0.SUM 0.25
""")
.build());
options.addOption(Option.builder("spice")
.required()
.hasArgs()
.desc("Required. SPICE metakernel file "
+ "containing body fixed frame and spacecraft kernels. Can specify more than one kernel, separated by "
+ "whitespace.")
.build());
options.addOption(Option.builder("target")
.required()
.hasArg()
.desc("Required. SPICE ID for the target.")
.build());
options.addOption(Option.builder("velocity")
.hasArgs()
.desc("Spacecraft velocity relative to target in the "
+ "body fixed frame. If present, use this fixed velocity in the MKSPK input file. Default is to "
+ "take the derivative of the fit position. Specify as three floating point values in km/sec, "
+ "separated by whitespace.")
.build());
options.addOption(Option.builder("velocityJ2000")
.desc("If present, argument to -velocity is in J2000 frame. " + " Ignored if -velocity is not set.")
.build());
return options;
}
public static void main(String[] args) throws SpiceException {
@@ -408,11 +469,9 @@ public class SPKFromSumFile implements TerrasaurTool {
NativeLibraryLoader.loadSpiceLibraries();
final double extend = cl.hasOption("extend") ? Double.parseDouble(cl.getOptionValue("extend")) : 0;
for (String kernel : cl.getOptionValues("spice")) KernelDatabase.load(kernel);
Body observer = new Body(cl.getOptionValue("observer"));
Body target = new Body(cl.getOptionValue("target"));
@@ -468,5 +527,4 @@ public class SPKFromSumFile implements TerrasaurTool {
logger.info("Finished.");
}
}
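The core idea in SPKFromSumFile, fitting a polynomial to each position component and taking velocity as the analytic derivative of the fit, can be sketched in miniature. This hypothetical example fits a parabola exactly through three (time, position) samples; the real tool uses a weighted least-squares fit over all sumfiles and works in J2000:

```java
// Sketch: fit x(t) = a + b*t + c*t^2 through three samples, then
// evaluate velocity as the derivative x'(t) = b + 2*c*t.
public class ParabolaFit {

    // Newton divided differences give the parabola through three
    // (t, x) pairs; assumes distinct times.
    static double[] fit(double[] t, double[] x) {
        double d1 = (x[1] - x[0]) / (t[1] - t[0]);
        double d2 = ((x[2] - x[1]) / (t[2] - t[1]) - d1) / (t[2] - t[0]);
        double c = d2;
        double b = d1 - d2 * (t[0] + t[1]);
        double a = x[0] - b * t[0] - c * t[0] * t[0];
        return new double[] {a, b, c};
    }

    public static void main(String[] args) {
        double[] t = {0, 10, 20};        // sample times, seconds
        double[] x = {100, 80, 20};      // one position component, km
        double[] p = fit(t, x);
        double tMid = 10;
        double pos = p[0] + p[1] * tMid + p[2] * tMid * tMid;
        double vel = p[1] + 2 * p[2] * tMid; // velocity from the fit
        System.out.println("pos=" + pos + " vel=" + vel);
    }
}
```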


@@ -55,476 +55,452 @@ import vtk.vtkPolyData;
public class ShapeFormatConverter implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
@Override
public String shortDescription() {
return "Transform a shape model to a new coordinate system.";
}
@Override
public String fullDescription(Options options) {
String header = "";
String footer =
"This program will rotate, translate, and/or scale a shape model. It can additionally transform a "
+ "single point, a sum file, or an SBMT ellipse file. For a sum file, the SCOBJ vector is "
+ "transformed and the cx, cy, cz, and sz vectors are rotated. For SBMT ellipse files, only the "
+ "center of the ellipse is transformed. The size, orientation, and all other fields in the "
+ "file are unchanged.";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
private enum COORDTYPE {
LATLON,
XYZ,
POLYDATA
}
private enum FORMATS {
ICQ,
LLR,
OBJ,
PDS,
PLT,
PLY,
STL,
VTK,
FITS,
SUM,
SBMT
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("centerOfRotation")
.hasArg()
.desc(
"Subtract this point before applying rotation matrix, add back after. "
+ "Specify by three floating point numbers separated by commas. If not present default is (0,0,0).")
.build());
options.addOption(
Option.builder("decimate")
.hasArg()
.desc(
"Reduce the number of facets in a shape model. The argument should be between 0 and 1. "
+ "For example, if a model has 100 facets and the argument to -decimate is 0.90, "
+ "there will be approximately 10 facets after the decimation.")
.build());
options.addOption(
Option.builder("input")
.required()
.hasArg()
.desc(
"Required. Name of shape model to transform. Extension must be icq, fits, llr, obj, pds, plt, ply, sbmt, stl, sum, or vtk. "
+ "Alternately transform a single point using three floating point numbers separated "
+ "by commas to specify XYZ coordinates, or latitude, longitude in degrees separated by commas. "
+ "Transformed point will be written to stdout in the same format as the input string.")
.build());
options.addOption(
Option.builder("inputFormat")
.hasArg()
.desc(
"Format of input file. If not present format will be inferred from inputFile extension.")
.build());
options.addOption(
Option.builder("output")
.hasArg()
.desc(
"Required for all but single point input. Name of transformed file. "
+ "Extension must be obj, plt, sbmt, stl, sum, or vtk.")
.build());
options.addOption(
Option.builder("outputFormat")
.hasArg()
.desc(
"Format of output file. If not present format will be inferred from outputFile extension.")
.build());
options.addOption(
Option.builder("register")
.hasArg()
.desc("Use SVD to transform input file to best align with register file.")
.build());
options.addOption(
Option.builder("rotate")
.hasArg()
.desc(
"Rotate surface points and spacecraft position. "
+ "Specify by an angle (degrees) and a 3 element rotation axis vector (XYZ) "
+ "separated by commas.")
.build());
options.addOption(
Option.builder("rotateToPrincipalAxes")
.desc("Rotate body to align along its principal axes of inertia.")
.build());
options.addOption(
Option.builder("scale")
.hasArg()
.desc(
"Scale the shape model by <arg>. This can either be one value or three "
+ "separated by commas. One value scales all three axes uniformly, "
+ "three values scale the x, y, and z axes respectively. For example, "
+ "-scale 0.5,0.25,1.5 scales the model in the x dimension by 0.5, the "
+ "y dimension by 0.25, the z dimension by 1.5.")
.build());
options.addOption(
Option.builder("translate")
.hasArg()
.desc(
"Translate surface points and spacecraft position. "
+ "Specify by three floating point numbers separated by commas.")
.build());
options.addOption(
Option.builder("translateToCenter")
.desc("Translate body so that its center of mass is at the origin.")
.build());
options.addOption(
Option.builder("transform")
.hasArg()
.desc(
"Translate and rotate surface points and spacecraft position. "
+ "Specify a file containing a 4x4 combined translation/rotation matrix. The top left 3x3 matrix "
+ "is the rotation matrix. The top three entries in the right hand column are the translation "
+ "vector. The bottom row is always 0 0 0 1.")
.build());
return options;
}
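The `-transform` option above expects a file holding a 4x4 combined translation/rotation matrix. A minimal sketch of how such a matrix acts on a point, following the layout in the option description (top-left 3x3 rotation, right column translation, bottom row `0 0 0 1`); the class and the sample matrix are illustrative, not part of Terrasaur:

```java
// Sketch: applying a -transform style 4x4 matrix to a point.
// Layout per the option description: top-left 3x3 is the rotation R,
// the top three entries of the right column are the translation t,
// and the bottom row is 0 0 0 1. Names and values are hypothetical.
class TransformDemo {

  /** Apply p' = R*p + t, where m is the 4x4 combined matrix. */
  static double[] apply(double[][] m, double[] p) {
    double[] out = new double[3];
    for (int i = 0; i < 3; i++) {
      out[i] = m[i][3]; // translation component
      for (int j = 0; j < 3; j++) out[i] += m[i][j] * p[j]; // rotation component
    }
    return out;
  }

  public static void main(String[] args) {
    // 90 degree rotation about +Z followed by a translation of (1, 0, 0)
    double[][] m = {
      {0, -1, 0, 1},
      {1, 0, 0, 0},
      {0, 0, 1, 0},
      {0, 0, 0, 1}
    };
    double[] p = apply(m, new double[] {1, 0, 0});
    System.out.printf("%.1f %.1f %.1f%n", p[0], p[1], p[2]);
  }
}
```
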
  public static void main(String[] args) throws Exception {
    TerrasaurTool defaultOBJ = new ShapeFormatConverter();

    Options options = defineOptions();

    CommandLine cl = defaultOBJ.parseArgs(args, options);

    Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
    for (MessageLabel ml : startupMessages.keySet())
      logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));

    NativeLibraryLoader.loadSpiceLibraries();
    NativeLibraryLoader.loadVtkLibraries();

    String filename = cl.getOptionValue("input");
    COORDTYPE coordType = COORDTYPE.POLYDATA;
    vtkPolyData polydata = null;
    SumFile sumFile = null;
    List<SBMTEllipseRecord> sbmtEllipse = null;

    String extension = null;
    if (cl.hasOption("inputFormat")) {
      try {
        extension =
            FORMATS.valueOf(cl.getOptionValue("inputFormat").toUpperCase()).name().toLowerCase();
      } catch (IllegalArgumentException e) {
        logger.warn("Unsupported -inputFormat {}", cl.getOptionValue("inputFormat"));
      }
    }
    if (extension == null) extension = FilenameUtils.getExtension(filename).toLowerCase();

    switch (extension) {
      case "icq", "llr", "obj", "pds", "plt", "ply", "stl", "vtk" ->
          polydata = PolyDataUtil.loadShapeModel(filename, extension);
      case "fits" -> polydata = PolyDataUtil.loadFITShapeModel(filename);
      case "sum" -> {
        List<String> lines = FileUtils.readLines(new File(filename), Charset.defaultCharset());
        sumFile = SumFile.fromLines(lines);
      }
      case "sbmt" -> {
        sbmtEllipse = new ArrayList<>();
        vtkPoints points = new vtkPoints();
        polydata = new vtkPolyData();
        polydata.SetPoints(points);
        for (String line : FileUtils.readLines(new File(filename), Charset.defaultCharset())) {
          SBMTEllipseRecord record = SBMTEllipseRecord.fromString(line);
          sbmtEllipse.add(record);
          points.InsertNextPoint(record.x(), record.y(), record.z());
        }
      }
      default -> {
        // Single point
        String[] params = filename.split(",");
        vtkPoints points = new vtkPoints();
        polydata = new vtkPolyData();
        polydata.SetPoints(points);
        if (params.length == 2) {
          double[] array =
              new Vector3D(
                      Math.toRadians(Double.parseDouble(params[0].trim())),
                      Math.toRadians(Double.parseDouble(params[1].trim())))
                  .toArray();
          points.InsertNextPoint(array);
          coordType = COORDTYPE.LATLON;
        } else if (params.length == 3) {
          double[] array = new double[3];
          for (int i = 0; i < 3; i++) array[i] = Double.parseDouble(params[i].trim());
          points.InsertNextPoint(array);
          coordType = COORDTYPE.XYZ;
        } else {
          logger.error(
              "Can't read input shape model {} with format {}", filename, extension.toUpperCase());
          System.exit(0);
        }
      }
    }

    if (cl.hasOption("decimate") && polydata != null) {
      double reduction = Double.parseDouble(cl.getOptionValue("decimate"));
      if (reduction < 0) {
        logger.printf(Level.WARN, "Argument to -decimate is %f! Setting to zero.", reduction);
        reduction = 0;
      }
      if (reduction > 1) {
        logger.printf(Level.WARN, "Argument to -decimate is %f! Setting to one.", reduction);
        reduction = 1;
      }
      PolyDataUtil.decimatePolyData(polydata, reduction);
    }

    if (coordType == COORDTYPE.POLYDATA && !cl.hasOption("output")) {
      logger.error(String.format("No output file specified for input file %s", filename));
      System.exit(0);
    }

    Vector3 centerOfRotation = null;
    Matrix33 rotation = null;
    Vector3 translation = null;
    Vector3 scale = new Vector3(1., 1., 1.);
    for (Option option : cl.getOptions()) {
      if (option.getOpt().equals("centerOfRotation"))
        centerOfRotation =
            MathConversions.toVector3(
                VectorUtils.stringToVector3D(cl.getOptionValue("centerOfRotation")));
      if (option.getOpt().equals("rotate"))
        rotation =
            MathConversions.toMatrix33(RotationUtils.stringToRotation(cl.getOptionValue("rotate")));
      if (option.getOpt().equals("scale")) {
        String scaleString = cl.getOptionValue("scale");
        if (scaleString.contains(",")) {
          scale = MathConversions.toVector3(VectorUtils.stringToVector3D(scaleString));
        } else {
          scale = scale.scale(Double.parseDouble(scaleString));
        }
      }
      if (option.getOpt().equals("translate"))
        translation =
            MathConversions.toVector3(VectorUtils.stringToVector3D(cl.getOptionValue("translate")));
      if (option.getOpt().equals("transform")) {
        List<String> lines =
            FileUtils.readLines(new File(cl.getOptionValue("transform")), Charset.defaultCharset());
        Pair<Vector3D, Rotation> pair = RotationUtils.stringToTransform(lines);
        translation = MathConversions.toVector3(pair.getKey());
        rotation = MathConversions.toMatrix33(pair.getValue());
      }
    }

    if (cl.hasOption("rotateToPrincipalAxes")) {
      if (polydata != null) {
        PolyDataStatistics stats = new PolyDataStatistics(polydata);
        if (stats.isClosed()) {
          ArrayList<double[]> axes = stats.getPrincipalAxes();
          // make X primary, Y secondary
          rotation = new Matrix33(new Vector3(axes.get(0)), 1, new Vector3(axes.get(1)), 2);
        } else {
          logger.warn("Shape is not closed, cannot determine principal axes.");
        }
      }
    }

    if (cl.hasOption("register")) {
      String register = cl.getOptionValue("register");
      vtkPolyData registeredPolydata = null;
      extension = FilenameUtils.getExtension(register).toLowerCase();
      if (extension.equals("llr")
          || extension.equals("obj")
          || extension.equals("pds")
          || extension.equals("plt")
          || extension.equals("ply")
          || extension.equals("stl")
          || extension.equals("vtk")) {
        registeredPolydata = PolyDataUtil.loadShapeModelAndComputeNormals(register);
      } else {
        logger.error(String.format("Can't read input shape model for registration: %s", register));
        System.exit(0);
      }

      if (registeredPolydata != null) {
        Vector3D centerA = PolyDataUtil.computePolyDataCentroid(polydata);
        Vector3D centerB = PolyDataUtil.computePolyDataCentroid(registeredPolydata);

        vtkPoints points = polydata.GetPoints();
        double[][] pointsA = new double[(int) points.GetNumberOfPoints()][3];
        for (int i = 0; i < points.GetNumberOfPoints(); i++)
          pointsA[i] = new Vector3D(points.GetPoint(i)).subtract(centerA).toArray();

        points = registeredPolydata.GetPoints();
        if (points.GetNumberOfPoints() != polydata.GetPoints().GetNumberOfPoints()) {
          logger.error("registered polydata does not have the same number of points as input.");
          System.exit(0);
        }

        double[][] pointsB = new double[(int) points.GetNumberOfPoints()][3];
        for (int i = 0; i < points.GetNumberOfPoints(); i++)
          pointsB[i] = new Vector3D(points.GetPoint(i)).subtract(centerB).toArray();

        double[][] H = new double[3][3];
        for (int ii = 0; ii < points.GetNumberOfPoints(); ii++) {
          for (int i = 0; i < 3; i++) {
            for (int j = 0; j < 3; j++) {
              H[i][j] += pointsA[ii][i] * pointsB[ii][j];
            }
          }
        }

        RealMatrix pointMatrix = new Array2DRowRealMatrix(H);
        SingularValueDecomposition svd = new SingularValueDecomposition(pointMatrix);
        RealMatrix uT = svd.getUT();
        RealMatrix v = svd.getV();
        RealMatrix R = v.multiply(uT);
        if (new LUDecomposition(R).getDeterminant() < 0) {
          for (int i = 0; i < 3; i++) {
            R.multiplyEntry(i, 2, -1);
          }
        }
        rotation = new Matrix33(R.getData());
        translation = MathConversions.toVector3(centerB);
        translation = translation.sub(rotation.mxv(MathConversions.toVector3(centerA)));
      }
    }

    if (sumFile != null) {
      if (rotation != null && translation != null)
        sumFile.transform(
            MathConversions.toVector3D(translation), MathConversions.toRotation(rotation));
    } else {
      Vector3 center;
      if (polydata.GetNumberOfPoints() > 1) {
        PolyDataStatistics stats = new PolyDataStatistics(polydata);
        center = new Vector3(stats.getCentroid());
      } else {
        center = new Vector3(polydata.GetPoint(0));
      }
      if (cl.hasOption("translateToCenter")) translation = center.negate();

      double[] values = new double[3];
      for (int j = 0; j < 3; j++) values[j] = center.getElt(j) * scale.getElt(j);
      Vector3 scaledCenter = new Vector3(values);

      vtkPoints points = polydata.GetPoints();
      for (int i = 0; i < points.GetNumberOfPoints(); i++) {
        Vector3 thisPoint = new Vector3(points.GetPoint(i));
        thisPoint = thisPoint.sub(center);
        for (int j = 0; j < 3; j++) values[j] = thisPoint.getElt(j) * scale.getElt(j);
        thisPoint = new Vector3(values);
        thisPoint = thisPoint.add(scaledCenter);
        if (rotation != null) {
          if (centerOfRotation == null) centerOfRotation = new Vector3();
          /*-
          else {
            System.out.printf("Center of rotation:\n%s\n", centerOfRotation.toString());
            System.out.printf("-centerOfRotation %f,%f,%f\n", centerOfRotation.getElt(0),
                centerOfRotation.getElt(1), centerOfRotation.getElt(2));
          }
          */
          thisPoint = rotation.mxv(thisPoint.sub(centerOfRotation)).add(centerOfRotation);
        }
        if (translation != null) thisPoint = thisPoint.add(translation);
        points.SetPoint(i, thisPoint.toArray());
      }
    }

    /*-
    if (rotation != null) {
      AxisAndAngle aaa = new AxisAndAngle(rotation);
      System.out.printf("Rotation:\n%s\n", rotation.toString());
      System.out.printf("-rotate %.5e,%.5e,%.5e,%.5e\n", Math.toDegrees(aaa.getAngle()),
          aaa.getAxis().getElt(0), aaa.getAxis().getElt(1), aaa.getAxis().getElt(2));
    }
    if (translation != null) {
      System.out.printf("Translation:\n%s\n", translation.toString());
      System.out.printf("-translate %.5e,%.5e,%.5e\n", translation.getElt(0), translation.getElt(1),
          translation.getElt(2));
    }
    */

    if (coordType == COORDTYPE.LATLON) {
      double[] pt = new double[3];
      polydata.GetPoint(0, pt);
      Vector3D point = new Vector3D(pt);
      double lon = Math.toDegrees(point.getAlpha());
      if (lon < 0) lon += 360;
      System.out.printf("%.16f,%.16f\n", Math.toDegrees(point.getDelta()), lon);
    } else if (coordType == COORDTYPE.XYZ) {
      double[] pt = new double[3];
      polydata.GetPoint(0, pt);
      System.out.printf("%.16f,%.16f,%.16f\n", pt[0], pt[1], pt[2]);
    } else {
      filename = cl.getOptionValue("output");
      extension = null;
      if (cl.hasOption("outputFormat")) {
        try {
          extension =
              FORMATS.valueOf(cl.getOptionValue("outputFormat").toUpperCase()).name().toLowerCase();
        } catch (IllegalArgumentException e) {
          logger.warn("Unsupported -outputFormat {}", cl.getOptionValue("outputFormat"));
        }
      }
      if (extension == null) extension = FilenameUtils.getExtension(filename).toLowerCase();

      switch (extension) {
        case "vtk" -> PolyDataUtil.saveShapeModelAsVTK(polydata, filename);
        case "obj" -> PolyDataUtil.saveShapeModelAsOBJ(polydata, filename);
        case "plt" -> PolyDataUtil.saveShapeModelAsPLT(polydata, filename);
        case "stl" -> PolyDataUtil.saveShapeModelAsSTL(polydata, filename);
        case "sum" -> {
          try (PrintWriter pw = new PrintWriter(filename)) {
            pw.print(sumFile.toString());
          }
        }
        case "sbmt" -> {
          if (sbmtEllipse == null) {
            logger.error("No input SBMT ellipse specified!");
            System.exit(0);
          }
          double[] pt = new double[3];
          List<SBMTEllipseRecord> transformedRecords = new ArrayList<>();
          int index = 0;
          for (SBMTEllipseRecord record : sbmtEllipse) {
            // transformed ellipse centers were stored in polydata, one per record
            polydata.GetPoint(index++, pt);
            Vector3D point = new Vector3D(pt);
            double lon = Math.toDegrees(point.getAlpha());
            if (lon < 0) lon += 360;
            Builder builder = ImmutableSBMTEllipseRecord.builder().from(record);
            builder.x(point.getX());
            builder.y(point.getY());
            builder.z(point.getZ());
            builder.lat(Math.toDegrees(point.getDelta()));
            builder.lon(lon);
            builder.radius(point.getNorm());
            transformedRecords.add(builder.build());
          }
          try (PrintWriter pw = new PrintWriter(filename)) {
            for (SBMTEllipseRecord record : transformedRecords) pw.println(record.toString());
          }
        }
        default -> {
          logger.error(
              "Can't write output shape model {} with format {}",
              filename,
              extension.toUpperCase());
          System.exit(0);
        }
      }
      logger.info("Wrote {}", filename);
    }
  }

  @Override
  public String shortDescription() {
    return "Transform a shape model to a new coordinate system.";
  }

  @Override
  public String fullDescription(Options options) {
    String header = "";
    String footer =
        "This program will rotate, translate, and/or scale a shape model. It can additionally transform a "
            + "single point, a sum file, or an SBMT ellipse file. For a sum file, the SCOBJ vector is "
            + "transformed and the cx, cy, cz, and sz vectors are rotated. For SBMT ellipse files, only the "
            + "center of the ellipse is transformed. The size, orientation, and all other fields in the "
            + "file are unchanged.";
    return TerrasaurTool.super.fullDescription(options, header, footer);
  }

  private enum COORDTYPE {
    LATLON,
    XYZ,
    POLYDATA
  }

  private enum FORMATS {
    ICQ,
    LLR,
    OBJ,
    PDS,
    PLT,
    PLY,
    STL,
    VTK,
    FITS,
    SUM,
    SBMT
  }
}
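For a single lat/lon input, the tool converts degrees to an XYZ unit vector (via Vector3D's alpha/delta convention) and converts back on output, wrapping longitude into [0, 360). A standalone sketch of that round trip; the class and method names are illustrative, not Terrasaur APIs:

```java
// Sketch of the lat/lon <-> unit-vector round trip used for single-point input.
// Names are hypothetical; only java.lang.Math is used.
class LatLonDemo {

  /** lat, lon in degrees -> unit vector {x, y, z}. */
  static double[] toUnit(double latDeg, double lonDeg) {
    double lat = Math.toRadians(latDeg), lon = Math.toRadians(lonDeg);
    return new double[] {
      Math.cos(lat) * Math.cos(lon), Math.cos(lat) * Math.sin(lon), Math.sin(lat)
    };
  }

  /** vector -> {lat, lon} in degrees, with lon wrapped to [0, 360). */
  static double[] toLatLon(double[] v) {
    double lon = Math.toDegrees(Math.atan2(v[1], v[0]));
    if (lon < 0) lon += 360; // same wrap as the tool's printed output
    double norm = Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
    double lat = Math.toDegrees(Math.asin(v[2] / norm));
    return new double[] {lat, lon};
  }

  public static void main(String[] args) {
    double[] ll = toLatLon(toUnit(-30.0, 250.0));
    System.out.printf("%.6f,%.6f%n", ll[0], ll[1]);
  }
}
```
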


@@ -57,19 +57,19 @@ import terrasaur.utils.spice.SpiceBundle;
public class SumFilesFromFlyby implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
private static final Logger logger = LogManager.getLogger();
@Override
public String shortDescription() {
return "Create sumfiles for a simplified flyby.";
}
@Override
public String shortDescription() {
return "Create sumfiles for a simplified flyby.";
}
@Override
public String fullDescription(Options options) {
@Override
public String fullDescription(Options options) {
String header = "";
String footer =
"""
String header = "";
String footer =
"""
This tool creates sumfiles at points along a straight line trajectory past a body to be imaged.
@@ -82,402 +82,366 @@ public class SumFilesFromFlyby implements TerrasaurTool {
Given these assumptions, the trajectory can be specified using closest approach distance and phase along with speed.
""";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
private SumFile sumfile;
private Function<Double, Vector3D> scPosFunc;
private Function<Double, Vector3D> scVelFunc;
private SumFilesFromFlyby() {}
public SumFilesFromFlyby(SumFile sumfile, double distance, double phase, double speed) {
this.sumfile = sumfile;
// given phase angle p, closest approach point is (cos p, sin p)
Vector3D closestApproach =
new Vector3D(FastMath.cos(phase), FastMath.sin(phase), 0.).scalarMultiply(distance);
Vector3D velocity =
new Vector3D(-FastMath.sin(phase), FastMath.cos(phase), 0.).scalarMultiply(speed);
/*-
* Assumptions:
*
* Sun lies along the X axis
* flyby is in the equatorial (XY) plane
*
*/
scPosFunc = t -> closestApproach.add(velocity.scalarMultiply(t));
scVelFunc = t -> velocity;
}
public SumFile getSumFile(double t) {
TimeConversion tc = TimeConversion.createUsingInternalConstants();
Builder builder = ImmutableSumFile.builder().from(sumfile);
double imageTime = t + tc.utcStringToTDB(sumfile.utcString());
builder.picnm(String.format("%s%d", sumfile.picnm(), (int) Math.round(imageTime)));
builder.utcString(tc.format("C").apply(imageTime));
Vector3D scPos = scPosFunc.apply(t);
builder.scobj(scPos.negate());
Rotation bodyFixedToCamera = RotationUtils.KprimaryJsecondary(scPos.negate(), Vector3D.MINUS_K);
builder.cx(bodyFixedToCamera.applyInverseTo(Vector3D.PLUS_I));
builder.cy(bodyFixedToCamera.applyInverseTo(Vector3D.PLUS_J));
builder.cz(bodyFixedToCamera.applyInverseTo(Vector3D.PLUS_K));
builder.sz(Vector3D.PLUS_I.scalarMultiply(1e8));
SumFile s = builder.build();
logger.info(
"{}: S/C position {}, phase {}",
s.utcString(),
s.scobj().negate(),
Math.toDegrees(Vector3D.angle(s.scobj().negate(), s.sz())));
return s;
}
private String writeMSOPCKFiles(
String basename, IntervalSet intervals, int frameID, SpiceBundle bundle)
throws SpiceException {
File commentFile = new File(basename + "_msopck-comments.txt");
if (commentFile.exists()) commentFile.delete();
String setupFile = basename + "_msopck.setup";
String inputFile = basename + "_msopck.inp";
try (PrintWriter pw = new PrintWriter(commentFile)) {
String allComments = "";
for (String comment : allComments.split("\\r?\\n")) pw.println(WordUtils.wrap(comment, 80));
} catch (FileNotFoundException e) {
logger.error(e.getLocalizedMessage(), e);
return TerrasaurTool.super.fullDescription(options, header, footer);
}
File fk = bundle.findKernel(String.format(".*%s\\.tf", basename));
File lsk = bundle.findKernel(".*naif[0-9]{4}\\.tls");
private SumFile sumfile;
private Function<Double, Vector3D> scPosFunc;
private Function<Double, Vector3D> scVelFunc;
Map<String, String> map = new TreeMap<>();
  private SumFilesFromFlyby() {}

  public SumFilesFromFlyby(SumFile sumfile, double distance, double phase, double speed) {
    this.sumfile = sumfile;

    /*-
     * Assumptions:
     *
     * Sun lies along the X axis
     * flyby is in the equatorial (XY) plane
     *
     */

    // given phase angle p, closest approach point is (cos p, sin p)
    Vector3D closestApproach =
        new Vector3D(FastMath.cos(phase), FastMath.sin(phase), 0.).scalarMultiply(distance);
    Vector3D velocity =
        new Vector3D(-FastMath.sin(phase), FastMath.cos(phase), 0.).scalarMultiply(speed);

    scPosFunc = t -> closestApproach.add(velocity.scalarMultiply(t));
    scVelFunc = t -> velocity;
  }

  public SumFile getSumFile(double t) {
    TimeConversion tc = TimeConversion.createUsingInternalConstants();

    Builder builder = ImmutableSumFile.builder().from(sumfile);

    double imageTime = t + tc.utcStringToTDB(sumfile.utcString());
    builder.picnm(String.format("%s%d", sumfile.picnm(), (int) Math.round(imageTime)));
    builder.utcString(tc.format("C").apply(imageTime));

    Vector3D scPos = scPosFunc.apply(t);
    builder.scobj(scPos.negate());

    Rotation bodyFixedToCamera = RotationUtils.KprimaryJsecondary(scPos.negate(), Vector3D.MINUS_K);
    builder.cx(bodyFixedToCamera.applyInverseTo(Vector3D.PLUS_I));
    builder.cy(bodyFixedToCamera.applyInverseTo(Vector3D.PLUS_J));
    builder.cz(bodyFixedToCamera.applyInverseTo(Vector3D.PLUS_K));
    builder.sz(Vector3D.PLUS_I.scalarMultiply(1e8));

    SumFile s = builder.build();
    logger.info(
        "{}: S/C position {}, phase {}",
        s.utcString(),
        s.scobj().negate(),
        Math.toDegrees(Vector3D.angle(s.scobj().negate(), s.sz())));
    return s;
  }

  /**
   * @param basename file basename
   * @param intervals time intervals
   * @param frameID NAIF frame id for the spacecraft-fixed frame
   * @param bundle SPICE bundle
   * @return command to run MSOPCK
   */
  private String writeMSOPCKFiles(String basename, IntervalSet intervals, int frameID, SpiceBundle bundle)
      throws SpiceException {
    File commentFile = new File(basename + "_msopck-comments.txt");
    if (commentFile.exists()) commentFile.delete();

    String setupFile = basename + "_msopck.setup";
    String inputFile = basename + "_msopck.inp";

    try (PrintWriter pw = new PrintWriter(commentFile)) {
      String allComments = "";
      for (String comment : allComments.split("\\r?\\n")) pw.println(WordUtils.wrap(comment, 80));
    } catch (FileNotFoundException e) {
      logger.error(e.getLocalizedMessage(), e);
    }

    File fk = bundle.findKernel(String.format(".*%s\\.tf", basename));
    File lsk = bundle.findKernel(".*naif[0-9]{4}\\.tls");

    Map<String, String> map = new TreeMap<>();
    map.put("LSK_FILE_NAME", "'" + lsk + "'");
    map.put("MAKE_FAKE_SCLK", String.format("'%s.tsc'", basename));
    map.put("CK_TYPE", "3");
    map.put("COMMENTS_FILE_NAME", String.format("'%s'", commentFile.getPath()));
    map.put("INSTRUMENT_ID", Integer.toString(frameID));
    map.put("REFERENCE_FRAME_NAME", "'J2000'");
    map.put("FRAMES_FILE_NAME", "'" + fk.getPath() + "'");
    map.put("ANGULAR_RATE_PRESENT", "'MAKE UP/NO AVERAGING'");
    map.put("INPUT_TIME_TYPE", "'UTC'");
    map.put("INPUT_DATA_TYPE", "'SPICE QUATERNIONS'");
    map.put("PRODUCER_ID", "'Hari.Nair@jhuapl.edu'");

    try (PrintWriter pw = new PrintWriter(setupFile)) {
      pw.println("\\begindata");
      for (String key : map.keySet()) {
        pw.printf("%s = %s\n", key, map.get(key));
      }
    } catch (FileNotFoundException e) {
      logger.error(e.getLocalizedMessage(), e);
    }

    NavigableMap<Double, SpiceQuaternion> attitudeMap = new TreeMap<>();
    double t0 = bundle.getTimeConversion().utcStringToTDB(sumfile.utcString());
    for (UnwritableInterval interval : intervals) {
      for (double t = interval.getBegin(); t < interval.getEnd(); t += interval.getLength() / 100) {
        double imageTime = t + t0;
        Vector3D scPos = scPosFunc.apply(t);
        SpiceQuaternion q = new SpiceQuaternion(
            MathConversions.toMatrix33(RotationUtils.KprimaryJsecondary(scPos.negate(), Vector3D.MINUS_K)));
        attitudeMap.put(imageTime, q);
      }
    }

    try (PrintWriter pw = new PrintWriter(new FileWriter(inputFile))) {
      for (double t : attitudeMap.keySet()) {
        SpiceQuaternion q = attitudeMap.get(t);
        Vector3 v = q.getVector();
        pw.printf(
            "%s %.14e %.14e %.14e %.14e\n",
            new TDBTime(t).toUTCString("ISOC", 6), q.getScalar(), v.getElt(0), v.getElt(1), v.getElt(2));
      }
    } catch (IOException e) {
      logger.error(e.getLocalizedMessage(), e);
    }

    return String.format("msopck %s %s %s.bc", setupFile, inputFile, basename);
  }
/**
* @param basename file basename
* @param intervals time intervals
* @param centerID NAIF id of center body
* @param bundle SPICE bundle
* @return command to run MKSPK
*/
private String writeMKSPKFiles(String basename, IntervalSet intervals, int centerID, SpiceBundle bundle) {
String commentFile = basename + "_mkspk-comments.txt";
String setupFile = basename + "_mkspk.setup";
String inputFile = basename + "_mkspk.inp";
try (PrintWriter pw = new PrintWriter(commentFile)) {
String allComments = "";
for (String comment : allComments.split("\\r?\\n")) pw.println(WordUtils.wrap(comment, 80));
} catch (FileNotFoundException e) {
logger.error(e.getLocalizedMessage(), e);
}
File lsk = bundle.findKernel(".*naif[0-9]{4}.tls");
Map<String, String> map = new TreeMap<>();
map.put("INPUT_DATA_TYPE", "'STATES'");
map.put("OUTPUT_SPK_TYPE", "13"); // hermite polynomial, unevenly spaced in time
map.put("OBJECT_ID", "-999");
map.put("CENTER_ID", String.format("%d", centerID));
map.put("COMMENT_FILE", String.format("'%s'", commentFile));
map.put("REF_FRAME_NAME", "'J2000'");
map.put("PRODUCER_ID", "'Hari.Nair@jhuapl.edu'");
map.put("DATA_ORDER", "'EPOCH X Y Z VX VY VZ'");
map.put("DATA_DELIMITER", "' '");
map.put("LEAPSECONDS_FILE", String.format("'%s'", lsk));
map.put("TIME_WRAPPER", "'# ETSECONDS'");
map.put("POLYNOM_DEGREE", "7");
map.put("SEGMENT_ID", "'SPK_STATES_13'");
map.put("LINES_PER_RECORD", "1");
try (PrintWriter pw = new PrintWriter(setupFile)) {
pw.println("\\begindata");
for (String key : map.keySet()) {
pw.printf("%s = %s\n", key, map.get(key));
}
} catch (FileNotFoundException e) {
logger.error(e.getLocalizedMessage(), e);
}
double t0 = bundle.getTimeConversion().utcStringToTDB(sumfile.utcString());
try (PrintWriter pw = new PrintWriter(inputFile)) {
for (UnwritableInterval interval : intervals) {
for (double t = interval.getBegin(); t < interval.getEnd(); t += interval.getLength() / 100) {
Vector3D scPos = scPosFunc.apply(t);
Vector3D scVel = scVelFunc.apply(t);
pw.printf(
"%.16e %.16e %.16e %.16e %.16e %.16e %.16e\n",
t + t0, scPos.getX(), scPos.getY(), scPos.getZ(), scVel.getX(), scVel.getY(), scVel.getZ());
}
}
} catch (FileNotFoundException e) {
logger.error(e.getLocalizedMessage(), e);
}
return String.format("mkspk -setup %s -input %s -output %s.bsp", setupFile, inputFile, basename);
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("distance")
.hasArg()
.required()
.desc("Required. Flyby distance from body center in km.")
.build());
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(Option.builder("logLevel")
.hasArg()
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim()
+ ". Default is INFO.")
.build());
options.addOption(Option.builder("mk")
.hasArg()
.desc(
"Path to NAIF metakernel. This should contain LSK, FK for the central body, and FK for the spacecraft. This is used by -mkspk and -msopck.")
.build());
options.addOption(Option.builder("mkspk")
.hasArg()
.desc(
"If present, create input files for MKSPK. The argument is the NAIF id for the central body (e.g. 10 for the Sun). This option requires -mk.")
.build());
options.addOption(Option.builder("msopck")
.desc("If present, create input files for MSOPCK. This option requires -mk.")
.build());
options.addOption(Option.builder("phase")
.hasArg()
.required()
.desc("Required. Phase angle at closest approach in degrees.")
.build());
options.addOption(Option.builder("speed")
.hasArg()
.required()
.desc("Required. Flyby speed at closest approach in km/s.")
.build());
options.addOption(Option.builder("template")
.hasArg()
.required()
.desc("Required. An existing sumfile to use as a template. Camera parameters are taken from this "
+ "file, while camera position and orientation are calculated.")
.build());
options.addOption(Option.builder("times")
.hasArgs()
.desc("If present, list of times separated by white space. In seconds, relative to closest approach.")
.build());
return options;
}
public static void main(String[] args) throws IOException, SpiceException {
TerrasaurTool defaultOBJ = new SumFilesFromFlyby();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
double phase = Double.parseDouble(cl.getOptionValue("phase"));
if (phase < 0 || phase > 180) {
logger.error("Phase angle {} out of range [0, 180]", phase);
System.exit(0);
}
String sumFileTemplate = cl.getOptionValue("template");
String base = FilenameUtils.getBaseName(sumFileTemplate);
String ext = FilenameUtils.getExtension(sumFileTemplate);
SumFilesFromFlyby app = new SumFilesFromFlyby(
SumFile.fromFile(new File(sumFileTemplate)),
Double.parseDouble(cl.getOptionValue("distance")),
Math.toRadians(phase),
Double.parseDouble(cl.getOptionValue("speed")));
NavigableSet<Double> times = new TreeSet<>();
if (cl.hasOption("times")) {
for (String s : cl.getOptionValues("times")) times.add(Double.parseDouble(s));
} else times.add(0.);
SpiceBundle bundle = null;
if (cl.hasOption("mk")) {
NativeLibraryLoader.loadSpiceLibraries();
bundle = new SpiceBundle.Builder()
.addMetakernels(Collections.singletonList(cl.getOptionValue("mk")))
.build();
KernelDatabase.load(cl.getOptionValue("mk"));
}
TimeConversion tc = bundle == null ? TimeConversion.createUsingInternalConstants() : bundle.getTimeConversion();
for (double t : times) {
SumFile s = app.getSumFile(t);
try (PrintWriter pw =
new PrintWriter(String.format("%s_%d.%s", base, (int) tc.utcStringToTDB(s.utcString()), ext))) {
pw.println(s);
} catch (FileNotFoundException e) {
logger.error(e.getLocalizedMessage(), e);
}
}
if (cl.hasOption("mkspk")) {
if (bundle == null) {
logger.error("Need -mk to use -mkspk!");
} else {
IntervalSet.Builder builder = IntervalSet.builder();
for (Double t : times) {
Double next = times.higher(t);
if (next != null) builder.add(new UnwritableInterval(t, next));
}
int centerID = Integer.parseInt(cl.getOptionValue("mkspk"));
String command = app.writeMKSPKFiles(base, builder.build(), centerID, bundle);
logger.info("Command to create SPK:\n{}", command);
}
}
if (cl.hasOption("msopck")) {
if (bundle == null) {
logger.error("Need -mk to use -msopck!");
} else {
IntervalSet.Builder builder = IntervalSet.builder();
for (Double t : times) {
Double next = times.higher(t);
if (next != null) builder.add(new UnwritableInterval(t, next));
}
final int scID = -999;
final int frameID = scID * 1000;
File spacecraftFK = new File(String.format("%s.tf", base));
try (PrintWriter pw = new PrintWriter(spacecraftFK)) {
pw.println("\\begindata");
pw.printf("FRAME_%s_FIXED = %d\n", base, frameID);
pw.printf("FRAME_%d_NAME = '%s_FIXED'\n", frameID, base);
pw.printf("FRAME_%d_CLASS = 3\n", frameID);
pw.printf("FRAME_%d_CENTER = %d\n", frameID, scID);
pw.printf("FRAME_%d_CLASS_ID = %d\n", frameID, frameID);
pw.println("\\begintext");
}
List<File> kernels = new ArrayList<>(bundle.getKernels());
kernels.add(spacecraftFK);
bundle = new SpiceBundle.Builder().addKernelList(kernels).build();
File spacecraftSCLK = new File(String.format("%s.tsc", base));
if (spacecraftSCLK.exists()) spacecraftSCLK.delete();
String command = app.writeMSOPCKFiles(base, builder.build(), frameID, bundle);
logger.info("Command to create CK:\n{}", command);
}
}
}
}
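The flyby geometry assumed in the `SumFilesFromFlyby` constructor can be checked in isolation: with the Sun along +X and the trajectory in the equatorial (XY) plane, closest approach lies at distance * (cos p, sin p, 0) and the velocity is speed * (-sin p, cos p, 0). Below is a minimal standalone sketch of that parameterization; the class name and numeric values are hypothetical, and plain arrays stand in for `Vector3D`:

```java
// Hypothetical standalone check (not part of Terrasaur) of the flyby geometry
// used by SumFilesFromFlyby: position(t) = closestApproach + velocity * t.
public class FlybyGeometrySketch {
  public static void main(String[] args) {
    double distance = 1000.0; // km (example value)
    double phase = Math.toRadians(45.0); // example phase angle
    double speed = 5.0; // km/s (example value)

    // closest approach point and velocity, as in the constructor's lambdas
    double[] ca = {distance * Math.cos(phase), distance * Math.sin(phase), 0.0};
    double[] vel = {-speed * Math.sin(phase), speed * Math.cos(phase), 0.0};

    // dot(ca, vel) should be ~0: the parameterized point really is the
    // minimum-distance point along the straight-line trajectory
    double dot = ca[0] * vel[0] + ca[1] * vel[1] + ca[2] * vel[2];
    System.out.printf("dot(ca, vel) = %.3e%n", dot);

    // recover the Sun-body-spacecraft angle from the closest-approach
    // direction; with the Sun along +X this is just acos(x / |ca|)
    double recovered = Math.toDegrees(Math.acos(ca[0] / distance));
    System.out.printf("recovered phase = %.3f deg%n", recovered);
  }
}
```

Position and velocity come out perpendicular at t = 0, and the recovered angle matches the requested phase, which is why the constructor can place the closest-approach point directly from the phase angle.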

View File

@@ -64,298 +64,299 @@ import terrasaur.utils.tessellation.FibonacciSphere;
*/
public class TileLookup implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
@Override
public String shortDescription() {
return "Locate tiles on the unit sphere.";
}
@Override
public String fullDescription(Options options) {
String header = "";
String footer = "";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
/**
* Given the base database name and a tile number, return the path to that database file. For
* example:
*
* <pre>
* System.out.println(TileLookup.getDBName("/path/to/database/ola.db", 6));
* System.out.println(TileLookup.getDBName("./ola.db", 6));
* System.out.println(TileLookup.getDBName("ola.db", 6));
*
* /path/to/database/ola.6.db
* ./ola.6.db
* ./ola.6.db
* </pre>
*
* @param dbName basename for database (e.g. /path/to/database/ola.db)
* @param tile tile index (e.g. 6)
* @return path to database file (e.g. /path/to/database/ola.6.db)
*/
public static String getDBName(String dbName, int tile) {
String fullPath = FilenameUtils.getFullPath(dbName);
if (fullPath.trim().isEmpty()) fullPath = ".";
if (!fullPath.endsWith(File.separator)) fullPath += File.separator;
return String.format(
"%s%s.%d.%s",
fullPath, FilenameUtils.getBaseName(dbName), tile, FilenameUtils.getExtension(dbName));
}
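The path handling in `getDBName` can be sketched without the commons-io `FilenameUtils` dependency. The helper class below is a hypothetical stand-in, not part of Terrasaur, that reproduces the documented examples on POSIX-style paths:

```java
// Hypothetical approximation of TileLookup.getDBName using only java.io:
// insert the tile index between the basename and the extension.
import java.io.File;

public class DBNameSketch {
  static String getDBName(String dbName, int tile) {
    File f = new File(dbName);
    // a bare filename has no parent; default to the current directory
    String parent = f.getParent() == null ? "." : f.getParent();
    String name = f.getName();
    int dot = name.lastIndexOf('.');
    String base = dot < 0 ? name : name.substring(0, dot);
    String ext = dot < 0 ? "" : name.substring(dot + 1);
    return String.format("%s%s%s.%d.%s", parent, File.separator, base, tile, ext);
  }

  public static void main(String[] args) {
    System.out.println(getDBName("/path/to/database/ola.db", 6)); // /path/to/database/ola.6.db
    System.out.println(getDBName("ola.db", 6)); // ./ola.6.db
  }
}
```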
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("nTiles")
.hasArg()
.required()
.desc("Number of points covering the sphere.")
.build());
options.addOption(
Option.builder("printCoords")
.desc(
"Print a table of points along with their coordinates. Takes precedence over -printStats, -printDistance, and -png.")
.build());
options.addOption(
Option.builder("printDistance")
.hasArg()
.desc(
"Print a table of points sorted by distance from the input point. "
+ "Format of the input point is longitude,latitude in degrees, comma separated without spaces. Takes precedence over -png.")
.build());
options.addOption(
Option.builder("printStats")
.desc(
"Print statistics on the distances (in degrees) between each point and its nearest neighbor. Takes precedence over -printDistance and -png.")
.build());
options.addOption(
Option.builder("png")
.hasArg()
.desc(
"Plot points and distance to nearest point in degrees. Argument is the name of the PNG file to write.")
.build());
return options;
}
public static void main(String[] args) {
TerrasaurTool defaultOBJ = new TileLookup();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
final int npts = Integer.parseInt(cl.getOptionValue("nTiles"));
FibonacciSphere fs = new FibonacciSphere(npts);
if (cl.hasOption("printCoords")) {
String header = String.format("%7s, %10s, %9s", "# index", "longitude", "latitude");
System.out.println(header);
// System.out.printf("%7s, %10s, %9s, %6s\n", "# index", "longitude", "latitude", "mapola");
for (int i = 0; i < npts; i++) {
LatitudinalVector lv = fs.getTileCenter(i);
double lon = Math.toDegrees(lv.getLongitude());
if (lon < 0) lon += 360;
double lat = Math.toDegrees(lv.getLatitude());
System.out.printf("%7d, %10.5f, %9.5f\n", i, lon, lat);
}
System.exit(0);
    }
if (cl.hasOption("printStats")) {
System.out.println(
"Statistics on distances between each point and its nearest neighbor (degrees):");
System.out.println(fs.getDistanceStats());
System.exit(0);
    }
if (cl.hasOption("printDistance")) {
String[] parts = cl.getOptionValue("printDistance").split(",");
double lon = Math.toRadians(Double.parseDouble(parts[0].trim()));
double lat = Math.toRadians(Double.parseDouble(parts[1].trim()));
LatitudinalVector lv = new LatitudinalVector(1, lat, lon);
NavigableMap<Double, Integer> distanceMap = fs.getDistanceMap(lv);
System.out.printf("%11s, %5s, %10s, %9s\n", "# distance", "index", "longitude", "latitude");
System.out.printf("%11s, %5s, %10s, %9s\n", "# (degrees)", "", "(degrees)", "(degrees)");
for (Double dist : distanceMap.keySet()) {
int index = distanceMap.get(dist);
lv = fs.getTileCenter(index);
System.out.printf(
"%11.5f, %5d, %10.5f, %9.5f\n",
Math.toDegrees(dist),
index,
Math.toDegrees(lv.getLongitude()),
Math.toDegrees(lv.getLatitude()));
}
System.exit(0);
    }
if (cl.hasOption("png")) {
PlotConfig config = ImmutablePlotConfig.builder().width(1000).height(1000).build();
String title = String.format("Fibonacci Sphere, n = %d, ", npts);
Map<String, Projection> projections = new LinkedHashMap<>();
projections.put(
title + "Rectangular Projection",
new ProjectionRectangular(config.width(), config.height() / 2));
projections.put(
title + "Orthographic Projection (0, 90)",
new ProjectionOrthographic(
config.width(), config.height(), new LatitudinalVector(1, Math.PI / 2, 0)));
projections.put(
title + "Orthographic Projection (0, 0)",
new ProjectionOrthographic(
config.width(), config.height(), new LatitudinalVector(1, 0, 0)));
projections.put(
title + "Orthographic Projection (90, 0)",
new ProjectionOrthographic(
config.width(), config.height(), new LatitudinalVector(1, 0, Math.PI / 2)));
projections.put(
title + "Orthographic Projection (180, 0)",
new ProjectionOrthographic(
config.width(), config.height(), new LatitudinalVector(1, 0, Math.PI)));
projections.put(
title + "Orthographic Projection (270, 0)",
new ProjectionOrthographic(
config.width(), config.height(), new LatitudinalVector(1, 0, 3 * Math.PI / 2)));
projections.put(
title + "Orthographic Projection (0, -90)",
new ProjectionOrthographic(
config.width(), config.height(), new LatitudinalVector(1, -Math.PI / 2, 0)));
final int nColors = 6;
ColorRamp ramp = ColorRamp.createLinear(1, nColors - 1);
List<Color> colors = new ArrayList<>();
colors.add(Color.BLACK);
for (int i = 1; i < nColors; i++) colors.add(ramp.getColor(i));
colors.add(Color.WHITE);
ramp = ImmutableColorRamp.builder().min(0).max(nColors).colors(colors).build();
double radius = fs.getDistanceStats().getMax();
ramp = ColorRamp.createLinear(0, radius).addLimitColors();
ArrayList<BufferedImage> images = new ArrayList<>();
for (String t : projections.keySet()) {
config = ImmutablePlotConfig.builder().from(config).title(t).build();
Projection p = projections.get(t);
if (p instanceof ProjectionRectangular)
config = ImmutablePlotConfig.builder().from(config).height(500).build();
else config = ImmutablePlotConfig.builder().from(config).height(1000).build();
MapPlot canvas = new MapPlot(config, p);
AxisX xLowerAxis = new AxisX(0, 360, "Longitude (degrees)", "%.0fE");
AxisY yLeftAxis = new AxisY(-90, 90, "Latitude (degrees)", "%.0f");
canvas.drawTitle();
canvas.setAxes(xLowerAxis, yLeftAxis);
// canvas.drawAxes();
BufferedImage image = canvas.getImage();
for (int i = 0; i < config.width(); i++) {
for (int j = 0; j < config.height(); j++) {
LatitudinalVector lv = p.pixelToSpherical(i, j);
if (lv == null) continue;
double closestDistance = Math.toDegrees(fs.getNearest(lv).getKey());
// int numPoints = fs.getDistanceMap(lv).subMap(0., Math.toRadians(radius)).size();
image.setRGB(
config.leftMargin() + i,
config.topMargin() + j,
ramp.getColor(closestDistance).getRGB());
}
}
DiscreteDataSet points = new DiscreteDataSet("");
for (int i = 0; i < fs.getNumTiles(); i++) {
LatitudinalVector lv = fs.getTileCenter(i);
points.add(lv.getLongitude(), lv.getLatitude());
}
if (p instanceof ProjectionRectangular) {
canvas.drawColorBar(
ImmutableColorBar.builder()
.rect(
new Rectangle(
canvas.getPageWidth() - 60, config.topMargin(), 10, config.height()))
.ramp(ramp)
.numTicks(nColors + 1)
.tickFunction(aDouble -> String.format("%.1f", aDouble))
.build());
// for (int i = 0; i < fs.getNumTiles(); i++) {
// LatitudinalVector lv = fs.getTileCenter(i);
// canvas.drawCircle(lv, radius, Math.toRadians(1), Color.RED);
// }
}
for (int i = 0; i < fs.getNumTiles(); i++) {
LatitudinalVector lv = fs.getTileCenter(i);
canvas.addAnnotation(
ImmutableAnnotation.builder().text(String.format("%d", i)).build(),
lv.getLongitude(),
lv.getLatitude());
}
images.add(canvas.getImage());
}
int width = 2400;
int height = 2400;
BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_3BYTE_BGR);
Graphics2D g = image.createGraphics();
g.setRenderingHint(
RenderingHints.KEY_INTERPOLATION, RenderingHints.VALUE_INTERPOLATION_BILINEAR);
int imageWidth = width;
int imageHeight = height / 3;
g.drawImage(
images.getFirst(),
width / 6,
0,
5 * width / 6,
imageHeight,
0,
0,
images.getFirst().getWidth(),
images.getFirst().getHeight(),
null);
imageWidth = width / 3;
for (int i = 1; i < 4; i++) {
g.drawImage(
images.get(i),
(i - 1) * imageWidth,
imageHeight,
i * imageWidth,
2 * imageHeight,
0,
0,
images.get(i).getWidth(),
images.get(i).getHeight(),
null);
}
for (int i = 4; i < 7; i++) {
g.drawImage(
images.get(i),
(i - 4) * imageWidth,
2 * imageHeight,
(i - 3) * imageWidth,
3 * imageHeight,
0,
0,
images.get(i).getWidth(),
images.get(i).getHeight(),
null);
}
g.dispose();
PlotCanvas.writeImage(cl.getOptionValue("png"), image);
    }
  }
public static void main(String[] args) {
TerrasaurTool defaultOBJ = new TileLookup();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
final int npts = Integer.parseInt(cl.getOptionValue("nTiles"));
FibonacciSphere fs = new FibonacciSphere(npts);
if (cl.hasOption("printCoords")) {
String header = String.format("%7s, %10s, %9s", "# index", "longitude", "latitude");
System.out.println(header);
// System.out.printf("%7s, %10s, %9s, %6s\n", "# index", "longitude", "latitude", "mapola");
for (int i = 0; i < npts; i++) {
LatitudinalVector lv = fs.getTileCenter(i);
double lon = Math.toDegrees(lv.getLongitude());
if (lon < 0) lon += 360;
double lat = Math.toDegrees(lv.getLatitude());
System.out.printf("%7d, %10.5f, %9.5f\n", i, lon, lat);
}
System.exit(0);
}
if (cl.hasOption("printStats")) {
System.out.println("Statistics on distances between each point and its nearest neighbor (degrees):");
System.out.println(fs.getDistanceStats());
System.exit(0);
}
if (cl.hasOption("printDistance")) {
String[] parts = cl.getOptionValue("printDistance").split(",");
double lon = Math.toRadians(Double.parseDouble(parts[0].trim()));
double lat = Math.toRadians(Double.parseDouble(parts[1].trim()));
LatitudinalVector lv = new LatitudinalVector(1, lat, lon);
NavigableMap<Double, Integer> distanceMap = fs.getDistanceMap(lv);
System.out.printf("%11s, %5s, %10s, %9s\n", "# distance", "index", "longitude", "latitude");
System.out.printf("%11s, %5s, %10s, %9s\n", "# (degrees)", "", "(degrees)", "(degrees)");
for (Double dist : distanceMap.keySet()) {
int index = distanceMap.get(dist);
lv = fs.getTileCenter(index);
System.out.printf(
"%11.5f, %5d, %10.5f, %9.5f\n",
Math.toDegrees(dist),
index,
Math.toDegrees(lv.getLongitude()),
Math.toDegrees(lv.getLatitude()));
}
System.exit(0);
}
if (cl.hasOption("png")) {
PlotConfig config =
ImmutablePlotConfig.builder().width(1000).height(1000).build();
String title = String.format("Fibonacci Sphere, n = %d, ", npts);
Map<String, Projection> projections = new LinkedHashMap<>();
projections.put(
title + "Rectangular Projection", new ProjectionRectangular(config.width(), config.height() / 2));
projections.put(
title + "Orthographic Projection (0, 90)",
new ProjectionOrthographic(
config.width(), config.height(), new LatitudinalVector(1, Math.PI / 2, 0)));
projections.put(
title + "Orthographic Projection (0, 0)",
new ProjectionOrthographic(config.width(), config.height(), new LatitudinalVector(1, 0, 0)));
projections.put(
title + "Orthographic Projection (90, 0)",
new ProjectionOrthographic(
config.width(), config.height(), new LatitudinalVector(1, 0, Math.PI / 2)));
projections.put(
title + "Orthographic Projection (180, 0)",
new ProjectionOrthographic(config.width(), config.height(), new LatitudinalVector(1, 0, Math.PI)));
projections.put(
title + "Orthographic Projection (270, 0)",
new ProjectionOrthographic(
config.width(), config.height(), new LatitudinalVector(1, 0, 3 * Math.PI / 2)));
projections.put(
title + "Orthographic Projection (0, -90)",
new ProjectionOrthographic(
config.width(), config.height(), new LatitudinalVector(1, -Math.PI / 2, 0)));
final int nColors = 6;
ColorRamp ramp = ColorRamp.createLinear(1, nColors - 1);
List<Color> colors = new ArrayList<>();
colors.add(Color.BLACK);
for (int i = 1; i < nColors; i++) colors.add(ramp.getColor(i));
colors.add(Color.WHITE);
ramp = ImmutableColorRamp.builder()
.min(0)
.max(nColors)
.colors(colors)
.build();
double radius = fs.getDistanceStats().getMax();
ramp = ColorRamp.createLinear(0, radius).addLimitColors();
ArrayList<BufferedImage> images = new ArrayList<>();
for (String t : projections.keySet()) {
config = ImmutablePlotConfig.builder().from(config).title(t).build();
Projection p = projections.get(t);
if (p instanceof ProjectionRectangular)
config = ImmutablePlotConfig.builder()
.from(config)
.height(500)
.build();
else
config = ImmutablePlotConfig.builder()
.from(config)
.height(1000)
.build();
MapPlot canvas = new MapPlot(config, p);
AxisX xLowerAxis = new AxisX(0, 360, "Longitude (degrees)", "%.0fE");
AxisY yLeftAxis = new AxisY(-90, 90, "Latitude (degrees)", "%.0f");
canvas.drawTitle();
canvas.setAxes(xLowerAxis, yLeftAxis);
// canvas.drawAxes();
BufferedImage image = canvas.getImage();
for (int i = 0; i < config.width(); i++) {
for (int j = 0; j < config.height(); j++) {
LatitudinalVector lv = p.pixelToSpherical(i, j);
if (lv == null) continue;
double closestDistance =
Math.toDegrees(fs.getNearest(lv).getKey());
// int numPoints = fs.getDistanceMap(lv).subMap(0., Math.toRadians(radius)).size();
image.setRGB(
config.leftMargin() + i,
config.topMargin() + j,
ramp.getColor(closestDistance).getRGB());
}
}
DiscreteDataSet points = new DiscreteDataSet("");
for (int i = 0; i < fs.getNumTiles(); i++) {
LatitudinalVector lv = fs.getTileCenter(i);
points.add(lv.getLongitude(), lv.getLatitude());
}
if (p instanceof ProjectionRectangular) {
canvas.drawColorBar(ImmutableColorBar.builder()
.rect(new Rectangle(canvas.getPageWidth() - 60, config.topMargin(), 10, config.height()))
.ramp(ramp)
.numTicks(nColors + 1)
.tickFunction(aDouble -> String.format("%.1f", aDouble))
.build());
// for (int i = 0; i < fs.getNumTiles(); i++) {
// LatitudinalVector lv = fs.getTileCenter(i);
// canvas.drawCircle(lv, radius, Math.toRadians(1), Color.RED);
// }
}
for (int i = 0; i < fs.getNumTiles(); i++) {
LatitudinalVector lv = fs.getTileCenter(i);
canvas.addAnnotation(
ImmutableAnnotation.builder()
.text(String.format("%d", i))
.build(),
lv.getLongitude(),
lv.getLatitude());
}
images.add(canvas.getImage());
}
int width = 2400;
int height = 2400;
BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_3BYTE_BGR);
Graphics2D g = image.createGraphics();
g.setRenderingHint(RenderingHints.KEY_INTERPOLATION, RenderingHints.VALUE_INTERPOLATION_BILINEAR);
int imageWidth = width;
int imageHeight = height / 3;
g.drawImage(
images.getFirst(),
width / 6,
0,
5 * width / 6,
imageHeight,
0,
0,
images.getFirst().getWidth(),
images.getFirst().getHeight(),
null);
imageWidth = width / 3;
for (int i = 1; i < 4; i++) {
g.drawImage(
images.get(i),
(i - 1) * imageWidth,
imageHeight,
i * imageWidth,
2 * imageHeight,
0,
0,
images.get(i).getWidth(),
images.get(i).getHeight(),
null);
}
for (int i = 4; i < 7; i++) {
g.drawImage(
images.get(i),
(i - 4) * imageWidth,
2 * imageHeight,
(i - 3) * imageWidth,
3 * imageHeight,
0,
0,
images.get(i).getWidth(),
images.get(i).getHeight(),
null);
}
g.dispose();
PlotCanvas.writeImage(cl.getOptionValue("png"), image);
}
}
}
}
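The `FibonacciSphere` internals are not shown in this diff. As an illustration only (an assumption about the approach, not taken from the source), a standard Fibonacci-lattice construction distributes n points nearly uniformly over the sphere by spacing points evenly in sin(latitude) while stepping longitude by the golden angle:

```java
/** Sketch of a Fibonacci sphere lattice; the real FibonacciSphere class may differ. */
public class FibonacciLattice {

    // Golden angle in radians: pi * (3 - sqrt(5))
    static final double GOLDEN_ANGLE = Math.PI * (3.0 - Math.sqrt(5.0));

    /** Returns {longitude, latitude} in radians for point i of n. */
    static double[] tileCenter(int i, int n) {
        // Space z = sin(latitude) evenly in (-1, 1)
        double z = 1.0 - (2.0 * i + 1.0) / n;
        double lat = Math.asin(z);
        // Step longitude by the golden angle, wrapped to [0, 2*pi)
        double lon = (i * GOLDEN_ANGLE) % (2.0 * Math.PI);
        if (lon < 0) lon += 2.0 * Math.PI;
        return new double[] {lon, lat};
    }

    public static void main(String[] args) {
        int n = 100;
        for (int i = 0; i < n; i++) {
            double[] lv = tileCenter(i, n);
            System.out.printf("%3d, %10.5f, %9.5f%n", i, Math.toDegrees(lv[0]), Math.toDegrees(lv[1]));
        }
    }
}
```

With this construction the nearest-neighbor distances reported by `-printStats` stay nearly constant across the sphere, which is why the tool can use a single color ramp for the distance plot.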


@@ -50,127 +50,140 @@ import terrasaur.utils.NativeLibraryLoader;
import terrasaur.utils.SPICEUtil;
public class TransformFrame implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
@Override
public String shortDescription() {
return "Transform coordinates between reference frames.";
}
@Override
public String fullDescription(Options options) {
String header = "";
String footer = "\nThis program transforms coordinates between reference frames.\n";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
private NavigableMap<TDBTime, Vector3> pointsIn;
private NavigableMap<TDBTime, Vector3> pointsOut;
public TransformFrame() {}
public void setPoints(NavigableMap<TDBTime, Vector3> pointsIn) {
this.pointsIn = pointsIn;
}
public void transformCoordinates(String inFrame, String outFrame) {
try {
ReferenceFrame from = new ReferenceFrame(inFrame);
ReferenceFrame to = new ReferenceFrame(outFrame);
pointsOut = new TreeMap<>(SPICEUtil.tdbComparator);
for (TDBTime t : pointsIn.keySet()) {
Matrix33 transform = from.getPositionTransformation(to, t);
pointsOut.put(t, transform.mxv(pointsIn.get(t)));
}
} catch (SpiceException e) {
logger.error(e.getLocalizedMessage());
}
}
public void write(String outFile) {
try (PrintWriter pw = new PrintWriter(outFile)) {
for (TDBTime t : pointsOut.keySet()) {
Vector3 v = pointsOut.get(t);
pw.printf("%.6f,%.6e,%.6e,%.6e\n", t.getTDBSeconds(), v.getElt(0), v.getElt(1),
v.getElt(2));
}
} catch (FileNotFoundException | SpiceException e) {
logger.error(e.getLocalizedMessage());
@Override
public String shortDescription() {
return "Transform coordinates between reference frames.";
}
}
@Override
public String fullDescription(Options options) {
String header = "";
String footer = "\nThis program transforms coordinates between reference frames.\n";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("inFile").required().hasArg()
.desc("Required. Text file containing comma separated t, x, y, z values. Time is ET.")
.build());
options.addOption(Option.builder("inFrame").required().hasArg()
.desc("Required. Name of inFile reference frame.").build());
options.addOption(Option.builder("outFile").required().hasArg()
.desc("Required. Name of output file. It will be in the same format as inFile.").build());
options.addOption(Option.builder("outFrame").required().hasArg()
.desc("Required. Name of outFile reference frame.").build());
options.addOption(Option.builder("spice").required().hasArg().desc(
"Required. Name of SPICE metakernel containing kernels needed to make the frame transformation.")
.build());
options.addOption(Option.builder("logFile").hasArg()
.desc("If present, save screen output to log file.").build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values())
sb.append(String.format("%s ", l.name()));
options.addOption(Option.builder("logLevel").hasArg()
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim() + ". Default is INFO.")
.build());
return options;
}
private NavigableMap<TDBTime, Vector3> pointsIn;
private NavigableMap<TDBTime, Vector3> pointsOut;
public static void main(String[] args) {
TerrasaurTool defaultOBJ = new TransformFrame();
public TransformFrame() {}
Options options = defineOptions();
public void setPoints(NavigableMap<TDBTime, Vector3> pointsIn) {
this.pointsIn = pointsIn;
}
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
TransformFrame tf = new TransformFrame();
NavigableMap<TDBTime, Vector3> map = new TreeMap<>(SPICEUtil.tdbComparator);
try {
File f = new File(cl.getOptionValue("inFile"));
List<String> lines = FileUtils.readLines(f, Charset.defaultCharset());
for (String line : lines) {
String trim = line.trim();
if (trim.isEmpty() || trim.startsWith("#"))
continue;
String[] parts = trim.split(",");
double et = Double.parseDouble(parts[0].trim());
if (et > 0) {
TDBTime t = new TDBTime(et);
Vector3 v = new Vector3(Double.parseDouble(parts[1].trim()),
Double.parseDouble(parts[2].trim()), Double.parseDouble(parts[3].trim()));
map.put(t, v);
public void transformCoordinates(String inFrame, String outFrame) {
try {
ReferenceFrame from = new ReferenceFrame(inFrame);
ReferenceFrame to = new ReferenceFrame(outFrame);
pointsOut = new TreeMap<>(SPICEUtil.tdbComparator);
for (TDBTime t : pointsIn.keySet()) {
Matrix33 transform = from.getPositionTransformation(to, t);
pointsOut.put(t, transform.mxv(pointsIn.get(t)));
}
} catch (SpiceException e) {
logger.error(e.getLocalizedMessage());
}
}
} catch (IOException e) {
logger.error(e.getLocalizedMessage());
}
tf.setPoints(map);
public void write(String outFile) {
NativeLibraryLoader.loadSpiceLibraries();
try {
KernelDatabase.load(cl.getOptionValue("spice"));
} catch (SpiceErrorException e) {
logger.error(e.getLocalizedMessage());
try (PrintWriter pw = new PrintWriter(outFile)) {
for (TDBTime t : pointsOut.keySet()) {
Vector3 v = pointsOut.get(t);
pw.printf("%.6f,%.6e,%.6e,%.6e\n", t.getTDBSeconds(), v.getElt(0), v.getElt(1), v.getElt(2));
}
} catch (FileNotFoundException | SpiceException e) {
logger.error(e.getLocalizedMessage());
}
}
tf.transformCoordinates(cl.getOptionValue("inFrame"), cl.getOptionValue("outFrame"));
tf.write(cl.getOptionValue("outFile"));
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("inFile")
.required()
.hasArg()
.desc("Required. Text file containing comma separated t, x, y, z values. Time is ET.")
.build());
options.addOption(Option.builder("inFrame")
.required()
.hasArg()
.desc("Required. Name of inFile reference frame.")
.build());
options.addOption(Option.builder("outFile")
.required()
.hasArg()
.desc("Required. Name of output file. It will be in the same format as inFile.")
.build());
options.addOption(Option.builder("outFrame")
.required()
.hasArg()
.desc("Required. Name of outFile reference frame.")
.build());
options.addOption(Option.builder("spice")
.required()
.hasArg()
.desc("Required. Name of SPICE metakernel containing kernels needed to make the frame transformation.")
.build());
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(Option.builder("logLevel")
.hasArg()
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim() + ". Default is INFO.")
.build());
return options;
}
public static void main(String[] args) {
TerrasaurTool defaultOBJ = new TransformFrame();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
TransformFrame tf = new TransformFrame();
NavigableMap<TDBTime, Vector3> map = new TreeMap<>(SPICEUtil.tdbComparator);
try {
File f = new File(cl.getOptionValue("inFile"));
List<String> lines = FileUtils.readLines(f, Charset.defaultCharset());
for (String line : lines) {
String trim = line.trim();
if (trim.isEmpty() || trim.startsWith("#")) continue;
String[] parts = trim.split(",");
double et = Double.parseDouble(parts[0].trim());
if (et > 0) {
TDBTime t = new TDBTime(et);
Vector3 v = new Vector3(
Double.parseDouble(parts[1].trim()),
Double.parseDouble(parts[2].trim()),
Double.parseDouble(parts[3].trim()));
map.put(t, v);
}
}
} catch (IOException e) {
logger.error(e.getLocalizedMessage());
}
tf.setPoints(map);
NativeLibraryLoader.loadSpiceLibraries();
try {
KernelDatabase.load(cl.getOptionValue("spice"));
} catch (SpiceErrorException e) {
logger.error(e.getLocalizedMessage());
}
tf.transformCoordinates(cl.getOptionValue("inFrame"), cl.getOptionValue("outFrame"));
tf.write(cl.getOptionValue("outFile"));
}
}
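The per-epoch transformation above reduces to a 3x3 rotation applied to each vector, which is what SPICE's `Matrix33.mxv` performs. A stdlib-only sketch of that operation, with a hypothetical rotation about +Z standing in for a real frame transformation (the names below are illustrative, not from the SPICE API):

```java
/** Minimal matrix-times-vector demo; a stand-in for Matrix33.mxv from JNISpice. */
public class RotateDemo {

    /** Multiply a 3x3 matrix by a 3-vector. */
    static double[] mxv(double[][] m, double[] v) {
        double[] out = new double[3];
        for (int i = 0; i < 3; i++)
            out[i] = m[i][0] * v[0] + m[i][1] * v[1] + m[i][2] * v[2];
        return out;
    }

    /** Frame rotation about +Z by theta radians (SPICE rotate() convention). */
    static double[][] rotZ(double theta) {
        double c = Math.cos(theta), s = Math.sin(theta);
        return new double[][] {{c, s, 0}, {-s, c, 0}, {0, 0, 1}};
    }

    public static void main(String[] args) {
        // Express the x unit vector in a frame rotated 90 degrees about +Z
        double[] r = mxv(rotZ(Math.PI / 2), new double[] {1, 0, 0});
        System.out.printf("%.3f %.3f %.3f%n", r[0], r[1], r[2]);
    }
}
```

In `TransformFrame` the matrix is time-dependent, so `getPositionTransformation` is evaluated at every epoch in the input file rather than once.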


@@ -43,220 +43,232 @@ import terrasaur.utils.NativeLibraryLoader;
/**
* Translate time between formats.
*
* @author nairah1
*
*/
public class TranslateTime implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
@Override
public String shortDescription() {
return "Convert between different time systems.";
}
@Override
public String fullDescription(Options options) {
String header = "";
String footer = "\nConvert between different time systems.\n";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
private enum Types {
JULIAN, SCLK, TDB, TDBCALENDAR, UTC
}
private Map<Integer, SCLK> sclkMap;
private TranslateTime(){}
public TranslateTime(Map<Integer, SCLK> sclkMap) {
this.sclkMap = sclkMap;
}
private TDBTime tdb;
public String toJulian() throws SpiceErrorException {
return tdb.toString("JULIAND.######");
}
private SCLK sclkKernel;
public SCLK getSCLKKernel() {
return sclkKernel;
}
public void setSCLKKernel(int sclkID) {
sclkKernel = sclkMap.get(sclkID);
if (sclkKernel == null) {
logger.error("SCLK {} is not loaded!", sclkID);
}
}
public SCLKTime toSCLK() throws SpiceException {
return new SCLKTime(sclkKernel, tdb);
}
public TDBTime toTDB() {
return tdb;
}
public String toUTC() throws SpiceErrorException {
return tdb.toUTCString("ISOC", 3);
}
public void setJulianDate(double julianDate) throws SpiceErrorException {
tdb = new TDBTime(String.format("%.6f JDUTC", julianDate));
}
public void setSCLK(String sclkString) throws SpiceException {
tdb = new TDBTime(new SCLKTime(sclkKernel, sclkString));
}
public void setTDB(double tdb) {
this.tdb = new TDBTime(tdb);
}
public void setTDBCalendarString(String tdbString) throws SpiceErrorException {
tdb = new TDBTime(String.format("%s TDB", tdbString));
}
public void setUTC(String utcStr) throws SpiceErrorException {
tdb = new TDBTime(utcStr);
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("logFile").hasArg()
.desc("If present, save screen output to log file.").build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values())
sb.append(String.format("%s ", l.name()));
options.addOption(Option.builder("logLevel").hasArg()
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim() + ". Default is INFO.")
.build());
options.addOption(Option.builder("sclk").hasArg().desc(
"SPICE id of the sclk to use. Default is to use the first one found in the kernel pool.")
.build());
options.addOption(Option.builder("spice").required().hasArg()
.desc("Required. SPICE metakernel containing leap second and SCLK.").build());
options.addOption(Option.builder("gui").desc("Launch a GUI.").build());
options.addOption(Option.builder("inputDate").hasArgs().desc("Date to translate.").build());
sb = new StringBuilder();
for (Types system : Types.values()) {
sb.append(String.format("%s ", system.name()));
}
options.addOption(Option.builder("inputSystem").hasArg().desc(
"Time system of inputDate. Valid values are " + sb.toString().trim() + ". Default is UTC.")
.build());
return options;
}
public static void main(String[] args) throws SpiceException {
TerrasaurTool defaultOBJ = new TranslateTime();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
// This is to avoid java crashing due to inability to connect to an X display
if (!cl.hasOption("gui"))
System.setProperty("java.awt.headless", "true");
NativeLibraryLoader.loadSpiceLibraries();
for (String kernel : cl.getOptionValues("spice"))
KernelDatabase.load(kernel);
LinkedHashMap<Integer, SCLK> sclkMap = new LinkedHashMap<>();
String[] sclk_data_type = KernelPool.getNames("SCLK_DATA_*");
for (String s : sclk_data_type) {
String[] parts = s.split("_");
int sclkID = -Integer.parseInt(parts[parts.length - 1]);
sclkMap.put(sclkID, new SCLK(sclkID));
@Override
public String shortDescription() {
return "Convert between different time systems.";
}
SCLK sclk = null;
if (cl.hasOption("sclk")) {
int sclkID = Integer.parseInt(cl.getOptionValue("sclk"));
if (sclkMap.containsKey(sclkID))
sclk = sclkMap.get(sclkID);
else {
logger.error("Cannot find SCLK {} in kernel pool!", sclkID);
StringBuilder sb = new StringBuilder();
for (Integer id : sclkMap.keySet())
sb.append(String.format("%d ", id));
logger.error("Loaded IDs are {}", sb.toString());
}
} else {
if (!sclkMap.values().isEmpty())
// set the SCLK to the first one found
sclk = sclkMap.values().stream().toList().get(0);
@Override
public String fullDescription(Options options) {
String header = "";
String footer = "\nConvert between different time systems.\n";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
if (sclk == null) {
logger.fatal("Cannot load SCLK");
System.exit(0);
private enum Types {
JULIAN,
SCLK,
TDB,
TDBCALENDAR,
UTC
}
TranslateTime tt = new TranslateTime(sclkMap);
private Map<Integer, SCLK> sclkMap;
if (cl.hasOption("gui")) {
TranslateTimeFX.setTranslateTime(tt);
TranslateTimeFX.setSCLKIDs(sclkMap.keySet());
TranslateTimeFX.main(args);
System.exit(0);
} else {
if (!cl.hasOption("inputDate")) {
logger.fatal("Missing required option -inputDate!");
System.exit(1);
}
tt.setSCLKKernel(sclk.getIDCode());
private TranslateTime() {}
public TranslateTime(Map<Integer, SCLK> sclkMap) {
this.sclkMap = sclkMap;
}
StringBuilder sb = new StringBuilder();
for (String s : cl.getOptionValues("inputDate"))
sb.append(String.format("%s ", s));
String inputDate = sb.toString().trim();
private TDBTime tdb;
Types type =
cl.hasOption("inputSystem") ? Types.valueOf(cl.getOptionValue("inputSystem").toUpperCase())
: Types.UTC;
switch (type) {
case JULIAN:
tt.setJulianDate(Double.parseDouble(inputDate));
break;
case SCLK:
tt.setSCLK(inputDate);
break;
case TDB:
tt.setTDB(Double.parseDouble(inputDate));
break;
case TDBCALENDAR:
tt.setTDBCalendarString(inputDate);
break;
case UTC:
tt.setUTC(inputDate);
break;
public String toJulian() throws SpiceErrorException {
return tdb.toString("JULIAND.######");
}
System.out.printf("# input date %s (%s)\n", inputDate, type.name());
System.out.printf("# UTC, TDB (Calendar), DOY, TDB, Julian Date, SCLK (%d)\n",
sclk.getIDCode());
private SCLK sclkKernel;
String utcString = tt.toTDB().toUTCString("ISOC", 3);
String tdbString = tt.toTDB().toString("YYYY-MM-DDTHR:MN:SC.### ::TDB");
String doyString = tt.toTDB().toString("DOY");
public SCLK getSCLKKernel() {
return sclkKernel;
}
System.out.printf("%s, %s, %s, %.6f, %s, %s\n", utcString, tdbString, doyString,
tt.toTDB().getTDBSeconds(), tt.toJulian(), tt.toSCLK().toString());
public void setSCLKKernel(int sclkID) {
sclkKernel = sclkMap.get(sclkID);
if (sclkKernel == null) {
logger.error("SCLK {} is not loaded!", sclkID);
}
}
}
public SCLKTime toSCLK() throws SpiceException {
return new SCLKTime(sclkKernel, tdb);
}
public TDBTime toTDB() {
return tdb;
}
public String toUTC() throws SpiceErrorException {
return tdb.toUTCString("ISOC", 3);
}
public void setJulianDate(double julianDate) throws SpiceErrorException {
tdb = new TDBTime(String.format("%.6f JDUTC", julianDate));
}
public void setSCLK(String sclkString) throws SpiceException {
tdb = new TDBTime(new SCLKTime(sclkKernel, sclkString));
}
public void setTDB(double tdb) {
this.tdb = new TDBTime(tdb);
}
public void setTDBCalendarString(String tdbString) throws SpiceErrorException {
tdb = new TDBTime(String.format("%s TDB", tdbString));
}
public void setUTC(String utcStr) throws SpiceErrorException {
tdb = new TDBTime(utcStr);
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(Option.builder("logLevel")
.hasArg()
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim() + ". Default is INFO.")
.build());
options.addOption(Option.builder("sclk")
.hasArg()
.desc("SPICE id of the sclk to use. Default is to use the first one found in the kernel pool.")
.build());
options.addOption(Option.builder("spice")
.required()
.hasArg()
.desc("Required. SPICE metakernel containing leap second and SCLK.")
.build());
options.addOption(Option.builder("gui").desc("Launch a GUI.").build());
options.addOption(
Option.builder("inputDate").hasArgs().desc("Date to translate.").build());
sb = new StringBuilder();
for (Types system : Types.values()) {
sb.append(String.format("%s ", system.name()));
}
options.addOption(Option.builder("inputSystem")
.hasArg()
.desc("Time system of inputDate. Valid values are "
+ sb.toString().trim() + ". Default is UTC.")
.build());
return options;
}
public static void main(String[] args) throws SpiceException {
TerrasaurTool defaultOBJ = new TranslateTime();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
// This is to avoid java crashing due to inability to connect to an X display
if (!cl.hasOption("gui")) System.setProperty("java.awt.headless", "true");
NativeLibraryLoader.loadSpiceLibraries();
for (String kernel : cl.getOptionValues("spice")) KernelDatabase.load(kernel);
LinkedHashMap<Integer, SCLK> sclkMap = new LinkedHashMap<>();
String[] sclk_data_type = KernelPool.getNames("SCLK_DATA_*");
for (String s : sclk_data_type) {
String[] parts = s.split("_");
int sclkID = -Integer.parseInt(parts[parts.length - 1]);
sclkMap.put(sclkID, new SCLK(sclkID));
}
SCLK sclk = null;
if (cl.hasOption("sclk")) {
int sclkID = Integer.parseInt(cl.getOptionValue("sclk"));
if (sclkMap.containsKey(sclkID)) sclk = sclkMap.get(sclkID);
else {
logger.error("Cannot find SCLK {} in kernel pool!", sclkID);
StringBuilder sb = new StringBuilder();
for (Integer id : sclkMap.keySet()) sb.append(String.format("%d ", id));
logger.error("Loaded IDs are {}", sb.toString());
}
} else {
if (!sclkMap.values().isEmpty())
// set the SCLK to the first one found
sclk = sclkMap.values().stream().toList().get(0);
}
if (sclk == null) {
logger.fatal("Cannot load SCLK");
System.exit(0);
}
TranslateTime tt = new TranslateTime(sclkMap);
if (cl.hasOption("gui")) {
TranslateTimeFX.setTranslateTime(tt);
TranslateTimeFX.setSCLKIDs(sclkMap.keySet());
TranslateTimeFX.main(args);
System.exit(0);
} else {
if (!cl.hasOption("inputDate")) {
logger.fatal("Missing required option -inputDate!");
System.exit(1);
}
tt.setSCLKKernel(sclk.getIDCode());
}
StringBuilder sb = new StringBuilder();
for (String s : cl.getOptionValues("inputDate")) sb.append(String.format("%s ", s));
String inputDate = sb.toString().trim();
Types type = cl.hasOption("inputSystem")
? Types.valueOf(cl.getOptionValue("inputSystem").toUpperCase())
: Types.UTC;
switch (type) {
case JULIAN:
tt.setJulianDate(Double.parseDouble(inputDate));
break;
case SCLK:
tt.setSCLK(inputDate);
break;
case TDB:
tt.setTDB(Double.parseDouble(inputDate));
break;
case TDBCALENDAR:
tt.setTDBCalendarString(inputDate);
break;
case UTC:
tt.setUTC(inputDate);
break;
}
System.out.printf("# input date %s (%s)\n", inputDate, type.name());
System.out.printf("# UTC, TDB (Calendar), DOY, TDB, Julian Date, SCLK (%d)\n", sclk.getIDCode());
String utcString = tt.toTDB().toUTCString("ISOC", 3);
String tdbString = tt.toTDB().toString("YYYY-MM-DDTHR:MN:SC.### ::TDB");
String doyString = tt.toTDB().toString("DOY");
System.out.printf(
"%s, %s, %s, %.6f, %s, %s\n",
utcString,
tdbString,
doyString,
tt.toTDB().getTDBSeconds(),
tt.toJulian(),
tt.toSCLK().toString());
}
}
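SPICE performs the real conversions above, including leap-second handling from the loaded kernels. For intuition only, the UTC-to-Julian-Date relationship used by the `JULIAN` type can be sketched with the JDK alone, using the fact that the Unix epoch (1970-01-01T00:00:00Z) is JD 2440587.5 (this sketch ignores leap seconds, which SPICE does not):

```java
import java.time.Instant;

/** Illustrative UTC-to-Julian-Date conversion; not the SPICE implementation. */
public class JulianDateDemo {

    // Julian Date of the Unix epoch, 1970-01-01T00:00:00Z
    static final double JD_UNIX_EPOCH = 2440587.5;

    /** Convert a UTC instant to a Julian Date, ignoring leap seconds. */
    static double toJulian(Instant t) {
        return JD_UNIX_EPOCH + t.toEpochMilli() / 86400000.0;
    }

    public static void main(String[] args) {
        // The J2000 reference epoch, 2000-01-01T12:00Z, is JD 2451545.0
        System.out.printf("%.6f%n", toJulian(Instant.parse("2000-01-01T12:00:00Z")));
    }
}
```

The tool's `tdb.toString("JULIAND.######")` output will differ from this sketch at the sub-second level because it accounts for the TDB-UTC offset.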


@@ -0,0 +1,159 @@
/*
* The MIT License
* Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in
* all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
* THE SOFTWARE.
*/
package terrasaur.apps;
import java.io.*;
import java.util.Map;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
import org.apache.commons.io.FilenameUtils;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import terrasaur.templates.TerrasaurTool;
import terrasaur.utils.ICQUtils;
import terrasaur.utils.NativeLibraryLoader;
import terrasaur.utils.PolyDataUtil;
import vtk.vtkPolyData;
public class TriAx implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
private TriAx() {}
@Override
public String shortDescription() {
return "Generate a triaxial ellipsoid in ICQ format.";
}
@Override
public String fullDescription(Options options) {
String footer =
"\nTriAx is an implementation of the SPC tool TRIAX, which generates a triaxial ellipsoid in ICQ format.\n";
return TerrasaurTool.super.fullDescription(options, "", footer);
}
static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("A")
.required()
.hasArg()
.desc("Long axis of the ellipsoid, arbitrary units (usually assumed to be km).")
.build());
options.addOption(Option.builder("B")
.required()
.hasArg()
.desc("Medium axis of the ellipsoid, arbitrary units (usually assumed to be km).")
.build());
options.addOption(Option.builder("C")
.required()
.hasArg()
.desc("Short axis of the ellipsoid, arbitrary units (usually assumed to be km).")
.build());
options.addOption(Option.builder("Q")
.required()
.hasArg()
.desc("ICQ size parameter. This is conventionally but not necessarily a power of 2.")
.build());
options.addOption(Option.builder("saveOBJ")
.desc("If present, save in OBJ format as well. "
+ "File will have the same name as ICQ file with an OBJ extension.")
.build());
options.addOption(Option.builder("output")
.hasArg()
.required()
.desc("Name of ICQ file to write.")
.build());
return options;
}
static final int MAX_Q = 512;
public static void main(String[] args) {
TerrasaurTool defaultOBJ = new TriAx();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet()) logger.info("{} {}", ml.label, startupMessages.get(ml));
int q = Integer.parseInt(cl.getOptionValue("Q"));
String shapefile = cl.getOptionValue("output");
double[] ax = new double[3];
ax[0] = Double.parseDouble(cl.getOptionValue("A"));
ax[1] = Double.parseDouble(cl.getOptionValue("B"));
ax[2] = Double.parseDouble(cl.getOptionValue("C"));
double[][][][] vec = new double[3][MAX_Q + 1][MAX_Q + 1][6];
for (int f = 0; f < 6; f++) {
for (int i = 0; i <= q; i++) {
for (int j = 0; j <= q; j++) {
double[] u = ICQUtils.xyf2u(q, i, j, f, ax);
double z = 1
/ Math.sqrt(
Math.pow(u[0] / ax[0], 2) + Math.pow(u[1] / ax[1], 2) + Math.pow(u[2] / ax[2], 2));
double[] v = new Vector3D(u).scalarMultiply(z).toArray();
for (int k = 0; k < 3; k++) {
vec[k][i][j][f] = v[k];
}
}
}
}
ICQUtils.writeICQ(q, vec, shapefile);
if (cl.hasOption("saveOBJ")) {
String basename = FilenameUtils.getBaseName(shapefile);
String dirname = FilenameUtils.getFullPath(shapefile);
if (dirname.isEmpty()) dirname = ".";
File obj = new File(dirname, basename + ".obj");
NativeLibraryLoader.loadVtkLibraries();
try {
vtkPolyData polydata = PolyDataUtil.loadShapeModel(shapefile);
if (polydata == null) {
logger.error("Cannot read {}", shapefile);
System.exit(0);
}
polydata = PolyDataUtil.removeDuplicatePoints(polydata);
polydata = PolyDataUtil.removeUnreferencedPoints(polydata);
polydata = PolyDataUtil.removeZeroAreaFacets(polydata);
PolyDataUtil.saveShapeModelAsOBJ(polydata, obj.getPath());
} catch (Exception e) {
logger.error(e);
}
}
}
}
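The projection step in `TriAx` scales each cube-face direction u by z = 1/sqrt((ux/A)^2 + (uy/B)^2 + (uz/C)^2), which places the scaled point exactly on the triaxial ellipsoid. A stdlib-only sketch of that step (class and method names here are illustrative, not the `ICQUtils` API):

```java
/** Demonstrates the ellipsoid projection used when building the ICQ grid. */
public class EllipsoidProject {

    /** Scale direction u so the result lies on the ellipsoid with semi-axes ax. */
    static double[] project(double[] u, double[] ax) {
        double z = 1.0 / Math.sqrt(Math.pow(u[0] / ax[0], 2)
                + Math.pow(u[1] / ax[1], 2) + Math.pow(u[2] / ax[2], 2));
        return new double[] {u[0] * z, u[1] * z, u[2] * z};
    }

    /** Left-hand side of the ellipsoid equation; equals 1 on the surface. */
    static double implicit(double[] v, double[] ax) {
        return Math.pow(v[0] / ax[0], 2) + Math.pow(v[1] / ax[1], 2) + Math.pow(v[2] / ax[2], 2);
    }

    public static void main(String[] args) {
        double[] ax = {3.0, 2.0, 1.0}; // A >= B >= C, arbitrary units
        double[] v = project(new double[] {1, 1, 1}, ax);
        // The projected point satisfies the ellipsoid equation to machine precision
        System.out.printf("%.12f%n", implicit(v, ax));
    }
}
```

Because the scaling is applied independently per grid node, the resulting ICQ mesh samples the ellipsoid with the cube-face topology expected by SPC tools.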


@@ -30,7 +30,6 @@ import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
@@ -46,238 +45,238 @@ import vtk.vtkOBJReader;
import vtk.vtkPolyData;
public class ValidateNormals implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
private ValidateNormals() {}
@Override
public String shortDescription() {
return "Check facet normal directions for an OBJ shape file.";
}
@Override
public String fullDescription(Options options) {
String footer =
"\nThis program checks that the normals of the shape model are not pointing inward.\n";
return TerrasaurTool.super.fullDescription(options, "", footer);
}
static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("origin")
.hasArg()
.desc(
"If present, center of body in xyz coordinates. "
+ "Specify as three floating point values separated by commas. Default is to use the centroid of "
+ "the input shape model.")
.build());
options.addOption(
Option.builder("obj").required().hasArg().desc("Shape model to validate.").build());
options.addOption(
Option.builder("output")
.hasArg()
.desc("Write out new OBJ file with corrected vertex orders for facets.")
.build());
options.addOption(
Option.builder("numThreads")
.hasArg()
.desc("Number of threads to run. Default is 1.")
.build());
return options;
}
private vtkPolyData polyData;
private ThreadLocal<vtkOBBTree> threadLocalsearchTree;
private double[] origin;
public ValidateNormals(vtkPolyData polyData) {
this.polyData = polyData;
PolyDataStatistics stats = new PolyDataStatistics(polyData);
origin = stats.getCentroid();
threadLocalsearchTree = new ThreadLocal<>();
}
public vtkOBBTree getOBBTree() {
vtkOBBTree searchTree = threadLocalsearchTree.get();
if (searchTree == null) {
searchTree = new vtkOBBTree();
searchTree.SetDataSet(polyData);
searchTree.SetTolerance(1e-12);
searchTree.BuildLocator();
threadLocalsearchTree.set(searchTree);
}
return searchTree;
}
public void setOrigin(double[] origin) {
this.origin = origin;
}
private class FlippedNormalFinder implements Callable<List<Long>> {
private static final DateTimeFormatter defaultFormatter =
DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss z")
.withLocale(Locale.getDefault())
.withZone(ZoneId.systemDefault());
private final long index0;
private final long index1;
public FlippedNormalFinder(long index0, long index1) {
this.index0 = index0;
this.index1 = index1;
@Override
public String shortDescription() {
return "Check facet normal directions for an OBJ shape file.";
}
@Override
public List<Long> call() {
public String fullDescription(Options options) {
logger.info("Thread {}: indices {} to {}", Thread.currentThread().threadId(), index0, index1);
vtkIdList idList = new vtkIdList();
vtkIdList cellIds = new vtkIdList();
List<Long> flippedNormals = new ArrayList<>();
String footer = "\nThis program checks that the normals of the shape model are not pointing inward.\n";
return TerrasaurTool.super.fullDescription(options, "", footer);
}
final long startTime = Instant.now().getEpochSecond();
final long numFacets = index1 - index0;
for (int i = 0; i < numFacets; ++i) {
static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("fast")
.desc("If present, only check for overhangs if center and normal point in opposite "
+ "directions. Default behavior is to always check for intersections between body center "
+ "and facet center.")
.build());
options.addOption(Option.builder("origin")
.hasArg()
.desc("If present, center of body in xyz coordinates. "
+ "Specify as three floating point values separated by commas. Default is to use the centroid of "
+ "the input shape model.")
.build());
options.addOption(Option.builder("obj")
.required()
.hasArg()
.desc("Shape model to validate.")
.build());
options.addOption(Option.builder("output")
.hasArg()
.desc("Write out new OBJ file with corrected vertex orders for facets.")
.build());
options.addOption(Option.builder("numThreads")
.hasArg()
.desc("Number of threads to run. Default is 1.")
.build());
return options;
}
if (i > 0 && i % (numFacets / 10) == 0) {
double pctDone = i / (numFacets * .01);
long elapsed = Instant.now().getEpochSecond() - startTime;
long estimatedFinish = (long) (elapsed / (pctDone / 100) + startTime);
String finish = defaultFormatter.format(Instant.ofEpochSecond(estimatedFinish));
logger.info(
String.format(
"Thread %d: read %d of %d facets. %.0f%% complete, projected finish %s",
Thread.currentThread().threadId(), index0 + i, index1, pctDone, finish));
private vtkPolyData polyData;
private ThreadLocal<vtkOBBTree> threadLocalsearchTree;
private double[] origin;
public ValidateNormals(vtkPolyData polyData) {
this.polyData = polyData;
PolyDataStatistics stats = new PolyDataStatistics(polyData);
origin = stats.getCentroid();
threadLocalsearchTree = new ThreadLocal<>();
}
public vtkOBBTree getOBBTree() {
vtkOBBTree searchTree = threadLocalsearchTree.get();
if (searchTree == null) {
searchTree = new vtkOBBTree();
searchTree.SetDataSet(polyData);
searchTree.SetTolerance(1e-12);
searchTree.BuildLocator();
threadLocalsearchTree.set(searchTree);
}
return searchTree;
}
public void setOrigin(double[] origin) {
this.origin = origin;
}
private class FlippedNormalFinder implements Callable<List<Long>> {
private static final DateTimeFormatter defaultFormatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss z")
.withLocale(Locale.getDefault())
.withZone(ZoneId.systemDefault());
private final long index0;
private final long index1;
private final boolean fast;
public FlippedNormalFinder(long index0, long index1, boolean fast) {
this.index0 = index0;
this.index1 = index1;
this.fast = fast;
}
long index = index0 + i;
@Override
public List<Long> call() {
CellInfo ci = CellInfo.getCellInfo(polyData, index, idList);
getOBBTree().IntersectWithLine(origin, ci.center().toArray(), null, cellIds);
logger.info("Thread {}: indices {} to {}", Thread.currentThread().threadId(), index0, index1);
vtkIdList idList = new vtkIdList();
vtkIdList cellIds = new vtkIdList();
List<Long> flippedNormals = new ArrayList<>();
// count up all crossings of the surface between the origin and the facet.
int numCrossings = 0;
for (int j = 0; j < cellIds.GetNumberOfIds(); j++) {
if (cellIds.GetId(j) == index) break;
numCrossings++;
final long startTime = Instant.now().getEpochSecond();
final long numFacets = index1 - index0;
for (int i = 0; i < numFacets; ++i) {
if (i > 0 && i % (numFacets / 10) == 0) {
double pctDone = i / (numFacets * .01);
long elapsed = Instant.now().getEpochSecond() - startTime;
long estimatedFinish = (long) (elapsed / (pctDone / 100) + startTime);
String finish = defaultFormatter.format(Instant.ofEpochSecond(estimatedFinish));
logger.info(String.format(
"Thread %d: read %d of %d facets. %.0f%% complete, projected finish %s",
Thread.currentThread().threadId(), index0 + i, index1, pctDone, finish));
}
long index = index0 + i;
CellInfo ci = CellInfo.getCellInfo(polyData, index, idList);
boolean isOpposite = (ci.center().dotProduct(ci.normal()) < 0);
int numCrossings = 0;
if (isOpposite || !fast) {
// count up all crossings of the surface between the origin and the facet.
getOBBTree().IntersectWithLine(origin, ci.center().toArray(), null, cellIds);
for (int j = 0; j < cellIds.GetNumberOfIds(); j++) {
if (cellIds.GetId(j) == index) break;
numCrossings++;
}
}
// if numCrossings is even, the radial and normal should point in the same direction. If it
// is odd, the radial and normal should point in opposite directions.
boolean shouldBeOpposite = (numCrossings % 2 == 1);
// XOR operator - true if both conditions are different
if (isOpposite ^ shouldBeOpposite) flippedNormals.add(index);
}
return flippedNormals;
}
}
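The crossing-parity rule used by `FlippedNormalFinder` can be isolated into a few lines. This is a sketch of the decision logic only (no VTK); the class name `NormalParity` is hypothetical. With an even number of surface crossings between the body center and the facet, an outward normal should point away from the center; with an odd number (an overhang), it should point toward it. XOR of the two booleans flags a flipped facet:

```java
// Sketch of the parity rule applied above: a facet normal is flipped when
// "normal points toward the center" disagrees with what the crossing count predicts.
public class NormalParity {
    // dotCenterNormal: facet center dotted with facet normal
    // numCrossings: surface intersections strictly between origin and facet center
    static boolean isFlipped(double dotCenterNormal, int numCrossings) {
        boolean isOpposite = dotCenterNormal < 0;
        boolean shouldBeOpposite = numCrossings % 2 == 1;
        return isOpposite ^ shouldBeOpposite; // XOR: true when they disagree
    }

    public static void main(String[] args) {
        System.out.println(isFlipped(-1.0, 0)); // inward normal, no overhang: flipped
        System.out.println(isFlipped(1.0, 0));  // outward normal, no overhang: OK
        System.out.println(isFlipped(-1.0, 1)); // inward normal under an overhang: OK
    }
}
```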
public void flipNormals(Collection<Long> facets) {
vtkCellArray cells = new vtkCellArray();
for (long i = 0; i < polyData.GetNumberOfCells(); ++i) {
vtkIdList idList = new vtkIdList();
polyData.GetCellPoints(i, idList);
if (facets.contains(i)) {
long id0 = idList.GetId(0);
long id1 = idList.GetId(1);
long id2 = idList.GetId(2);
idList.SetId(0, id0);
idList.SetId(1, id2);
idList.SetId(2, id1);
}
cells.InsertNextCell(idList);
}
polyData.SetPolys(cells);
}
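`flipNormals` corrects a facet by swapping two of its vertex ids, which reverses the winding order and therefore negates the computed normal. A minimal 2D sketch of why that works (the signed area changes sign when the winding reverses); `WindingFlip` is an illustrative name, not a Terrasaur class:

```java
// Sketch: reversing a triangle's winding (swap two vertex ids, as flipNormals
// does via vtkIdList) negates its normal; in 2D the signed area flips sign.
public class WindingFlip {
    static long[] flip(long[] tri) {
        return new long[] {tri[0], tri[2], tri[1]};
    }

    // z component of the cross product (b-a) x (c-a) for 2D points a, b, c
    static double signedArea2D(double[][] p, long[] tri) {
        double[] a = p[(int) tri[0]], b = p[(int) tri[1]], c = p[(int) tri[2]];
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0]);
    }

    public static void main(String[] args) {
        double[][] pts = {{0, 0}, {1, 0}, {0, 1}};
        long[] tri = {0, 1, 2};
        System.out.println(signedArea2D(pts, tri));       // positive (CCW)
        System.out.println(signedArea2D(pts, flip(tri))); // negative (CW)
    }
}
```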
public static void main(String[] args) throws Exception {
TerrasaurTool defaultOBJ = new ValidateNormals();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet()) logger.info("{} {}", ml.label, startupMessages.get(ml));
NativeLibraryLoader.loadVtkLibraries();
// PolyDataUtil's OBJ reader messes with the normals - not reliable for a local obj
vtkOBJReader smallBodyReader = new vtkOBJReader();
smallBodyReader.SetFileName(cl.getOptionValue("obj"));
smallBodyReader.Update();
vtkPolyData polyData = new vtkPolyData();
polyData.ShallowCopy(smallBodyReader.GetOutput());
smallBodyReader.Delete();
ValidateNormals app = new ValidateNormals(polyData);
logger.info("Read {} facets from {}", polyData.GetNumberOfCells(), cl.getOptionValue("obj"));
if (cl.hasOption("origin")) {
String[] parts = cl.getOptionValue("origin").split(",");
double[] origin = new double[3];
for (int i = 0; i < 3; i++) origin[i] = Double.parseDouble(parts[i]);
app.setOrigin(origin);
}
// if numCrossings is even, the radial and normal should point in the same direction. If it
// is odd, the radial and normal should point in opposite directions.
boolean shouldBeOpposite = (numCrossings % 2 == 1);
boolean isOpposite = (ci.center().dotProduct(ci.normal()) < 0);
Set<Long> flippedNormals = new HashSet<>();
// XOR operator - true if both conditions are different
if (isOpposite ^ shouldBeOpposite) flippedNormals.add(index);
}
boolean fast = cl.hasOption("fast");
int numThreads = cl.hasOption("numThreads") ? Integer.parseInt(cl.getOptionValue("numThreads")) : 1;
try (ExecutorService executor = Executors.newFixedThreadPool(numThreads)) {
List<Future<List<Long>>> futures = new ArrayList<>();
return flippedNormals;
}
}
long numFacets = polyData.GetNumberOfCells() / numThreads;
for (int i = 0; i < numThreads; i++) {
long fromIndex = i * numFacets;
long toIndex = Math.min(polyData.GetNumberOfCells(), fromIndex + numFacets);
public void flipNormals(Collection<Long> facets) {
vtkCellArray cells = new vtkCellArray();
for (long i = 0; i < polyData.GetNumberOfCells(); ++i) {
vtkIdList idList = new vtkIdList();
polyData.GetCellPoints(i, idList);
if (facets.contains(i)) {
long id0 = idList.GetId(0);
long id1 = idList.GetId(1);
long id2 = idList.GetId(2);
idList.SetId(0, id0);
idList.SetId(1, id2);
idList.SetId(2, id1);
}
cells.InsertNextCell(idList);
}
polyData.SetPolys(cells);
}
FlippedNormalFinder fnf = app.new FlippedNormalFinder(fromIndex, toIndex, fast);
futures.add(executor.submit(fnf));
}
public static void main(String[] args) throws Exception {
TerrasaurTool defaultOBJ = new ValidateNormals();
for (Future<List<Long>> future : futures) flippedNormals.addAll(future.get());
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
NativeLibraryLoader.loadVtkLibraries();
// PolyDataUtil's OBJ reader messes with the normals - not reliable for a local obj
vtkOBJReader smallBodyReader = new vtkOBJReader();
smallBodyReader.SetFileName(cl.getOptionValue("obj"));
smallBodyReader.Update();
vtkPolyData polyData = new vtkPolyData();
polyData.ShallowCopy(smallBodyReader.GetOutput());
smallBodyReader.Delete();
ValidateNormals app = new ValidateNormals(polyData);
logger.info("Read {} facets from {}", polyData.GetNumberOfCells(), cl.getOptionValue("obj"));
if (cl.hasOption("origin")) {
String[] parts = cl.getOptionValue("origin").split(",");
double[] origin = new double[3];
for (int i = 0; i < 3; i++) origin[i] = Double.parseDouble(parts[i]);
app.setOrigin(origin);
}
Set<Long> flippedNormals = new HashSet<>();
int numThreads =
cl.hasOption("numThreads") ? Integer.parseInt(cl.getOptionValue("numThreads")) : 1;
try (ExecutorService executor = Executors.newFixedThreadPool(numThreads)) {
List<Future<List<Long>>> futures = new ArrayList<>();
long numFacets = polyData.GetNumberOfCells() / numThreads;
for (int i = 0; i < numThreads; i++) {
long fromIndex = i * numFacets;
long toIndex = Math.min(polyData.GetNumberOfCells(), fromIndex + numFacets);
FlippedNormalFinder fnf = app.new FlippedNormalFinder(fromIndex, toIndex);
futures.add(executor.submit(fnf));
}
for (Future<List<Long>> future : futures) flippedNormals.addAll(future.get());
executor.shutdown();
}
logger.info(
"Found {} flipped normals out of {} facets",
flippedNormals.size(),
polyData.GetNumberOfCells());
if (cl.hasOption("output")) {
NavigableSet<Long> sorted = new TreeSet<>(flippedNormals);
String header = "";
if (!flippedNormals.isEmpty()) {
header = "# The following indices were flipped from " + cl.getOptionValue("obj") + ":\n";
StringBuilder sb = new StringBuilder("# ");
for (Long index : sorted) {
sb.append(String.format("%d", index));
if (index < sorted.last()) sb.append(", ");
executor.shutdown();
}
sb.append("\n");
header += WordUtils.wrap(sb.toString(), 80, "\n# ", false);
logger.info(header);
}
app.flipNormals(flippedNormals);
PolyDataUtil.saveShapeModelAsOBJ(app.polyData, cl.getOptionValue("output"), header);
logger.info("wrote OBJ file {}", cl.getOptionValue("output"));
logger.info("Found {} flipped normals out of {} facets", flippedNormals.size(), polyData.GetNumberOfCells());
if (cl.hasOption("output")) {
NavigableSet<Long> sorted = new TreeSet<>(flippedNormals);
String header = "";
if (!flippedNormals.isEmpty()) {
header = "# The following indices were flipped from " + cl.getOptionValue("obj") + ":\n";
StringBuilder sb = new StringBuilder("# ");
for (Long index : sorted) {
sb.append(String.format("%d", index));
if (index < sorted.last()) sb.append(", ");
}
sb.append("\n");
header += WordUtils.wrap(sb.toString(), 80, "\n# ", false);
logger.info(header);
}
app.flipNormals(flippedNormals);
PolyDataUtil.saveShapeModelAsOBJ(app.polyData, cl.getOptionValue("output"), header);
logger.info("wrote OBJ file {}", cl.getOptionValue("output"));
}
logger.info("ValidateNormals done");
}
logger.info("ValidateNormals done");
}
}


@@ -51,374 +51,361 @@ import vtk.vtkPolyData;
*/
public class ValidateOBJ implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
private static final Logger logger = LogManager.getLogger();
@Override
public String shortDescription() {
return "Check a closed shape file in OBJ format for errors.";
}
@Override
public String shortDescription() {
return "Check a closed shape file in OBJ format for errors.";
}
@Override
public String fullDescription(Options options) {
String header = "";
String footer =
"""
@Override
public String fullDescription(Options options) {
String header = "";
String footer =
"""
This program checks that a shape model has the correct number of facets and vertices. \
It will also check for duplicate vertices, vertices that are not referenced by any facet, and zero area facets.
""";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
private vtkPolyData polyData;
private String validationMsg;
private ValidateOBJ() {}
public ValidateOBJ(vtkPolyData polyData) {
this.polyData = polyData;
}
/**
* @return {@link vtkPolyData#GetNumberOfCells()}
*/
public long facetCount() {
return polyData.GetNumberOfCells();
}
/**
* @return {@link vtkPolyData#GetNumberOfPoints()}
*/
public long vertexCount() {
return polyData.GetNumberOfPoints();
}
/**
* @return description of test result
*/
public String getMessage() {
return validationMsg;
}
/**
* @return true if number of facets in the shape model satisfies 3*4^n where n is an integer
*/
public boolean testFacets() {
boolean meetsCondition = facetCount() % 3 == 0;
if (meetsCondition) {
long facet3 = facetCount() / 3;
double logFacet3 = Math.log(facet3) / Math.log(4);
if (Math.ceil(logFacet3) != Math.floor(logFacet3)) meetsCondition = false;
return TerrasaurTool.super.fullDescription(options, header, footer);
}
int n = (int) (Math.log(facetCount() / 3.) / Math.log(4.0) + 0.5);
if (meetsCondition) {
validationMsg =
String.format(
"Model has %d facets. This satisfies f = 3*4^n with n = %d.", facetCount(), n);
} else {
validationMsg =
String.format(
"Model has %d facets. This does not satisfy f = 3*4^n. A shape model with %.0f facets has n = %d.",
facetCount(), 3 * Math.pow(4, n), n);
private vtkPolyData polyData;
private String validationMsg;
private ValidateOBJ() {}
public ValidateOBJ(vtkPolyData polyData) {
this.polyData = polyData;
}
return meetsCondition;
}
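The f = 3*4^n rule in `testFacets` can also be checked with integer arithmetic, which avoids the floating-point log comparison used above. This is a sketch under that alternative approach, not the Terrasaur implementation; `FacetCount` is a hypothetical name:

```java
// Sketch of the facet-count rule: a valid model has f = 3*4^n facets
// for some integer n >= 0 (e.g. 3, 12, 48, 192, ...).
public class FacetCount {
    static boolean satisfies(long f) {
        if (f <= 0 || f % 3 != 0) return false;
        long m = f / 3;
        while (m % 4 == 0) m /= 4; // strip factors of 4
        return m == 1;             // what remains must be 4^0
    }

    public static void main(String[] args) {
        System.out.println(satisfies(49152)); // 3*4^7 -> true
        System.out.println(satisfies(100));   // false
    }
}
```

The integer version is exact for any `long`, whereas `Math.log` can misclassify very large facet counts near a power of four.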
/**
* @return true if number of vertices in the shape model satisfies v=f/2+2
*/
public boolean testVertices() {
boolean meetsCondition = (facetCount() + 4) / 2 == vertexCount();
if (meetsCondition)
validationMsg =
String.format(
"Model has %d vertices and %d facets. This satisfies v = f/2+2.",
vertexCount(), facetCount());
else
validationMsg =
String.format(
"Model has %d vertices and %d facets. This does not satisfy v = f/2+2. Number of vertices should be %d.",
vertexCount(), facetCount(), facetCount() / 2 + 2);
return meetsCondition;
}
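The v = f/2 + 2 relation checked by `testVertices` follows from Euler's formula for a closed genus-0 triangulation: V - E + F = 2, and every edge is shared by two triangles so E = 3F/2, giving V = F/2 + 2. A minimal sketch of the check (hypothetical class name, not part of Terrasaur):

```java
// Sketch of testVertices: for a closed, genus-0 triangulated surface,
// V - E + F = 2 with E = 3F/2 implies V = F/2 + 2.
public class EulerCheck {
    static boolean satisfies(long v, long f) {
        return v == f / 2 + 2;
    }

    public static void main(String[] args) {
        System.out.println(satisfies(4, 4));         // tetrahedron: V=4, F=4
        System.out.println(satisfies(24578, 49152)); // 49152/2 + 2 = 24578
    }
}
```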
/**
* @return key is vertex id, value is a list of vertices within a hard coded distance of 1e-10.
*/
public NavigableMap<Long, List<Long>> findDuplicateVertices() {
SmallBodyModel sbm = new SmallBodyModel(polyData);
double[] iPt = new double[3];
NavigableMap<Long, List<Long>> map = new TreeMap<>();
double tol = 1e-10;
for (long i = 0; i < vertexCount(); i++) {
polyData.GetPoint(i, iPt);
List<Long> closestVertices = new ArrayList<>();
for (Long id : sbm.findClosestVerticesWithinRadius(iPt, tol))
if (id > i) closestVertices.add(id);
if (!closestVertices.isEmpty()) map.put(i, closestVertices);
if (map.containsKey(i) && !map.get(i).isEmpty()) {
StringBuilder sb = new StringBuilder();
sb.append(String.format("Duplicates for vertex %d: ", i + 1));
for (Long dupId : map.get(i)) sb.append(String.format("%d ", dupId + 1));
logger.debug(sb.toString());
}
/**
* @return {@link vtkPolyData#GetNumberOfCells()}
*/
public long facetCount() {
return polyData.GetNumberOfCells();
}
validationMsg = String.format("%d vertices have duplicates", map.size());
/**
* @return {@link vtkPolyData#GetNumberOfPoints()}
*/
public long vertexCount() {
return polyData.GetNumberOfPoints();
}
return map;
}
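`findDuplicateVertices` delegates the radius search to `SmallBodyModel`'s VTK-backed locator. The same idea can be sketched with a brute-force O(n^2) scan, which is easier to read but far slower on real models; `DuplicateVertices` and its layout are illustrative assumptions, not the Terrasaur API:

```java
// Brute-force sketch of duplicate-vertex detection: for each vertex, record
// later vertices within a tolerance (the real code uses a spatial locator).
import java.util.ArrayList;
import java.util.List;
import java.util.NavigableMap;
import java.util.TreeMap;

public class DuplicateVertices {
    static NavigableMap<Integer, List<Integer>> find(double[][] pts, double tol) {
        NavigableMap<Integer, List<Integer>> map = new TreeMap<>();
        for (int i = 0; i < pts.length; i++) {
            List<Integer> dups = new ArrayList<>();
            for (int j = i + 1; j < pts.length; j++) { // only later ids, as above
                double d2 = 0;
                for (int k = 0; k < 3; k++) d2 += Math.pow(pts[i][k] - pts[j][k], 2);
                if (Math.sqrt(d2) < tol) dups.add(j);
            }
            if (!dups.isEmpty()) map.put(i, dups);
        }
        return map;
    }

    public static void main(String[] args) {
        double[][] pts = {{0, 0, 0}, {1, 0, 0}, {0, 0, 1e-12}};
        System.out.println(find(pts, 1e-10)); // vertex 0 duplicates vertex 2
    }
}
```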
/**
* @return description of test result
*/
public String getMessage() {
return validationMsg;
}
/**
* @return a list of vertex indices where one or more of the coordinates fail {@link
* Double#isFinite(double)}.
*/
public List<Integer> findMalformedVertices() {
double[] iPt = new double[3];
NavigableSet<Integer> vertexIndices = new TreeSet<>();
for (int i = 0; i < vertexCount(); i++) {
polyData.GetPoint(i, iPt);
for (int j = 0; j < 3; j++) {
if (!Double.isFinite(iPt[j])) {
logger.debug("Vertex {}: {} {} {}", i, iPt[0], iPt[1], iPt[2]);
vertexIndices.add(i);
break;
/**
* @return true if number of facets in the shape model satisfies 3*4^n where n is an integer
*/
public boolean testFacets() {
boolean meetsCondition = facetCount() % 3 == 0;
if (meetsCondition) {
long facet3 = facetCount() / 3;
double logFacet3 = Math.log(facet3) / Math.log(4);
if (Math.ceil(logFacet3) != Math.floor(logFacet3)) meetsCondition = false;
}
}
}
validationMsg = String.format("%d malformed vertices ", vertexIndices.size());
return new ArrayList<>(vertexIndices);
}
/**
* @return a list of vertex indices that are not referenced by any facet
*/
public List<Long> findUnreferencedVertices() {
NavigableSet<Long> vertexIndices = new TreeSet<>();
for (long i = 0; i < polyData.GetNumberOfPoints(); i++) {
vertexIndices.add(i);
int n = (int) (Math.log(facetCount() / 3.) / Math.log(4.0) + 0.5);
if (meetsCondition) {
validationMsg =
String.format("Model has %d facets. This satisfies f = 3*4^n with n = %d.", facetCount(), n);
} else {
validationMsg = String.format(
"Model has %d facets. This does not satisfy f = 3*4^n. A shape model with %.0f facets has n = %d.",
facetCount(), 3 * Math.pow(4, n), n);
}
return meetsCondition;
}
vtkIdList idList = new vtkIdList();
for (int i = 0; i < facetCount(); ++i) {
polyData.GetCellPoints(i, idList);
long id0 = idList.GetId(0);
long id1 = idList.GetId(1);
long id2 = idList.GetId(2);
/**
* @return true if number of vertices in the shape model satisfies v=f/2+2
*/
public boolean testVertices() {
boolean meetsCondition = (facetCount() + 4) / 2 == vertexCount();
if (meetsCondition)
validationMsg = String.format(
"Model has %d vertices and %d facets. This satisfies v = f/2+2.", vertexCount(), facetCount());
else
validationMsg = String.format(
"Model has %d vertices and %d facets. This does not satisfy v = f/2+2. Number of vertices should be %d.",
vertexCount(), facetCount(), facetCount() / 2 + 2);
vertexIndices.remove(id0);
vertexIndices.remove(id1);
vertexIndices.remove(id2);
return meetsCondition;
}
if (!vertexIndices.isEmpty()) {
double[] pt = new double[3];
for (long id : vertexIndices) {
polyData.GetPoint(id, pt);
logger.debug("Unreferenced vertex {} [{}, {}, {}]", id + 1, pt[0], pt[1], pt[2]);
// note OBJ vertices are numbered from 1 but VTK uses 0
}
/**
* @return key is vertex id, value is a list of vertices within a hard coded distance of 1e-10.
*/
public NavigableMap<Long, List<Long>> findDuplicateVertices() {
SmallBodyModel sbm = new SmallBodyModel(polyData);
double[] iPt = new double[3];
NavigableMap<Long, List<Long>> map = new TreeMap<>();
double tol = 1e-10;
for (long i = 0; i < vertexCount(); i++) {
polyData.GetPoint(i, iPt);
List<Long> closestVertices = new ArrayList<>();
for (Long id : sbm.findClosestVerticesWithinRadius(iPt, tol)) if (id > i) closestVertices.add(id);
if (!closestVertices.isEmpty()) map.put(i, closestVertices);
if (map.containsKey(i) && !map.get(i).isEmpty()) {
StringBuilder sb = new StringBuilder();
sb.append(String.format("Duplicates for vertex %d: ", i + 1));
for (Long dupId : map.get(i)) sb.append(String.format("%d ", dupId + 1));
logger.debug(sb.toString());
}
}
validationMsg = String.format("%d vertices have duplicates", map.size());
return map;
}
validationMsg = String.format("%d unreferenced vertices found", vertexIndices.size());
return new ArrayList<>(vertexIndices);
}
/**
* @return a list of facet indices where the facet has zero area
*/
public List<Integer> findZeroAreaFacets() {
List<Integer> zeroAreaFacets = new ArrayList<>();
vtkIdList idList = new vtkIdList();
double[] pt0 = new double[3];
double[] pt1 = new double[3];
double[] pt2 = new double[3];
for (int i = 0; i < facetCount(); ++i) {
polyData.GetCellPoints(i, idList);
long id0 = idList.GetId(0);
long id1 = idList.GetId(1);
long id2 = idList.GetId(2);
polyData.GetPoint(id0, pt0);
polyData.GetPoint(id1, pt1);
polyData.GetPoint(id2, pt2);
// would be faster to check if id0==id1||id0==id2||id1==id2 but there may be
// duplicate vertices
TriangularFacet facet =
new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
double area = facet.getArea();
if (area == 0) {
zeroAreaFacets.add(i);
logger.debug(
"Facet {} has zero area. Vertices are {} [{}, {}, {}], {} [{}, {}, {}], and {} [{}, {}, {}]",
i + 1,
id0 + 1,
pt0[0],
pt0[1],
pt0[2],
id1 + 1,
pt1[0],
pt1[1],
pt1[2],
id2 + 1,
pt2[0],
pt2[1],
pt2[2]);
}
/**
* @return a list of vertex indices where one or more of the coordinates fail {@link
* Double#isFinite(double)}.
*/
public List<Integer> findMalformedVertices() {
double[] iPt = new double[3];
NavigableSet<Integer> vertexIndices = new TreeSet<>();
for (int i = 0; i < vertexCount(); i++) {
polyData.GetPoint(i, iPt);
for (int j = 0; j < 3; j++) {
if (!Double.isFinite(iPt[j])) {
logger.debug("Vertex {}: {} {} {}", i, iPt[0], iPt[1], iPt[2]);
vertexIndices.add(i);
break;
}
}
}
validationMsg = String.format("%d malformed vertices ", vertexIndices.size());
return new ArrayList<>(vertexIndices);
}
validationMsg = String.format("%d zero area facets found", zeroAreaFacets.size());
return zeroAreaFacets;
}
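The zero-area test above relies on `TriangularFacet.getArea()`. The underlying geometry is the standard half-magnitude of the cross product of two edge vectors; this sketch reproduces it in plain Java (illustrative class name), and shows why the comment above prefers an area test over comparing vertex ids: collinear but distinct vertices also give zero area.

```java
// Sketch: triangle area = |(p1-p0) x (p2-p0)| / 2; zero means a degenerate facet.
public class FacetArea {
    static double area(double[] p0, double[] p1, double[] p2) {
        double[] u = new double[3], v = new double[3];
        for (int k = 0; k < 3; k++) {
            u[k] = p1[k] - p0[k];
            v[k] = p2[k] - p0[k];
        }
        double cx = u[1] * v[2] - u[2] * v[1];
        double cy = u[2] * v[0] - u[0] * v[2];
        double cz = u[0] * v[1] - u[1] * v[0];
        return 0.5 * Math.sqrt(cx * cx + cy * cy + cz * cz);
    }

    public static void main(String[] args) {
        // collinear vertices: distinct ids, still zero area
        System.out.println(area(new double[] {0, 0, 0},
                                new double[] {1, 0, 0},
                                new double[] {2, 0, 0})); // 0.0
    }
}
```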
/**
* @return a list of vertex indices that are not referenced by any facet
*/
public List<Long> findUnreferencedVertices() {
NavigableSet<Long> vertexIndices = new TreeSet<>();
for (long i = 0; i < polyData.GetNumberOfPoints(); i++) {
vertexIndices.add(i);
}
/**
* @return statistics on the angle between the facet radial and normal vectors
*/
public DescriptiveStatistics normalStats() {
DescriptiveStatistics stats = new DescriptiveStatistics();
vtkIdList idList = new vtkIdList();
for (int i = 0; i < facetCount(); ++i) {
polyData.GetCellPoints(i, idList);
long id0 = idList.GetId(0);
long id1 = idList.GetId(1);
long id2 = idList.GetId(2);
VectorStatistics cStats = new VectorStatistics();
VectorStatistics nStats = new VectorStatistics();
vertexIndices.remove(id0);
vertexIndices.remove(id1);
vertexIndices.remove(id2);
}
vtkIdList idList = new vtkIdList();
double[] pt0 = new double[3];
double[] pt1 = new double[3];
double[] pt2 = new double[3];
for (int i = 0; i < facetCount(); ++i) {
polyData.GetCellPoints(i, idList);
long id0 = idList.GetId(0);
long id1 = idList.GetId(1);
long id2 = idList.GetId(2);
polyData.GetPoint(id0, pt0);
polyData.GetPoint(id1, pt1);
polyData.GetPoint(id2, pt2);
if (!vertexIndices.isEmpty()) {
double[] pt = new double[3];
for (long id : vertexIndices) {
polyData.GetPoint(id, pt);
logger.debug("Unreferenced vertex {} [{}, {}, {}]", id + 1, pt[0], pt[1], pt[2]);
// note OBJ vertices are numbered from 1 but VTK uses 0
}
}
// would be faster to check if id0==id1||id0==id2||id1==id2 but there may be
// duplicate vertices
TriangularFacet facet =
new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
if (facet.getArea() > 0) {
stats.addValue(facet.getCenter().getDot(facet.getNormal()));
cStats.add(facet.getCenter());
nStats.add(facet.getNormal());
}
validationMsg = String.format("%d unreferenced vertices found", vertexIndices.size());
return new ArrayList<>(vertexIndices);
}
validationMsg =
String.format(
"Using %d non-zero area facets: Mean angle between radial and normal is %f degrees, "
+ "angle between mean radial and mean normal is %f degrees",
stats.getN(),
Math.toDegrees(Math.acos(stats.getMean())),
Math.toDegrees(Vector3D.angle(cStats.getMean(), nStats.getMean())));
/**
* @return a list of facet indices where the facet has zero area
*/
public List<Integer> findZeroAreaFacets() {
List<Integer> zeroAreaFacets = new ArrayList<>();
vtkIdList idList = new vtkIdList();
double[] pt0 = new double[3];
double[] pt1 = new double[3];
double[] pt2 = new double[3];
return stats;
}
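The summary statistic reported by `normalStats` is the arccosine of the mean cosine between each facet's radial (center) direction and its normal. A minimal sketch of that reduction with explicit normalization, assuming unit-independent inputs; `RadialNormalAngle` is a hypothetical name, not the Terrasaur API:

```java
// Sketch: average cos(angle) between per-facet radial and normal vectors,
// then report acos(mean) in degrees, as normalStats summarizes.
public class RadialNormalAngle {
    static double meanAngleDeg(double[][] radials, double[][] normals) {
        double sum = 0;
        for (int i = 0; i < radials.length; i++) {
            double dot = 0, r = 0, n = 0;
            for (int k = 0; k < 3; k++) {
                dot += radials[i][k] * normals[i][k];
                r += radials[i][k] * radials[i][k];
                n += normals[i][k] * normals[i][k];
            }
            sum += dot / Math.sqrt(r * n); // cosine of the angle for facet i
        }
        return Math.toDegrees(Math.acos(sum / radials.length));
    }

    public static void main(String[] args) {
        double[][] r = {{1, 0, 0}, {0, 1, 0}};
        double[][] n = {{1, 0, 0}, {0, 1, 0}};
        System.out.println(meanAngleDeg(r, n)); // 0.0 when normals align with radials
    }
}
```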
for (int i = 0; i < facetCount(); ++i) {
polyData.GetCellPoints(i, idList);
long id0 = idList.GetId(0);
long id1 = idList.GetId(1);
long id2 = idList.GetId(2);
polyData.GetPoint(id0, pt0);
polyData.GetPoint(id1, pt1);
polyData.GetPoint(id2, pt2);
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("obj").required().hasArg().desc("Shape model to validate.").build());
options.addOption(
Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(
Option.builder("logLevel")
.hasArg()
.desc(
"If present, print messages above selected priority. Valid values are "
+ sb.toString().trim()
+ ". Default is INFO.")
.build());
options.addOption(Option.builder("output").hasArg().desc("Filename for output OBJ.").build());
options.addOption(
Option.builder("removeDuplicateVertices")
.desc("Remove duplicate vertices. Use with -output to save OBJ.")
.build());
options.addOption(
Option.builder("removeUnreferencedVertices")
.desc("Remove unreferenced vertices. Use with -output to save OBJ.")
.build());
options.addOption(
Option.builder("removeZeroAreaFacets")
.desc("Remove facets with zero area. Use with -output to save OBJ.")
.build());
options.addOption(
Option.builder("cleanup")
.desc(
"Combines -removeDuplicateVertices, -removeUnreferencedVertices, and -removeZeroAreaFacets.")
.build());
return options;
}
// would be faster to check if id0==id1||id0==id2||id1==id2 but there may be
// duplicate vertices
TriangularFacet facet = new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
double area = facet.getArea();
if (area == 0) {
zeroAreaFacets.add(i);
logger.debug(
"Facet {} has zero area. Vertices are {} [{}, {}, {}], {} [{}, {}, {}], and {} [{}, {}, {}]",
i + 1,
id0 + 1,
pt0[0],
pt0[1],
pt0[2],
id1 + 1,
pt1[0],
pt1[1],
pt1[2],
id2 + 1,
pt2[0],
pt2[1],
pt2[2]);
}
}
public static void main(String[] args) throws Exception {
TerrasaurTool defaultOBJ = new ValidateOBJ();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
NativeLibraryLoader.loadVtkLibraries();
vtkPolyData polyData = PolyDataUtil.loadShapeModel(cl.getOptionValue("obj"));
if (polyData == null) {
logger.error("Cannot read {}, exiting.", cl.getOptionValue("obj"));
System.exit(0);
validationMsg = String.format("%d zero area facets found", zeroAreaFacets.size());
return zeroAreaFacets;
}
ValidateOBJ vo = new ValidateOBJ(polyData);
/**
* @return statistics on the angle between the facet radial and normal vectors
*/
public DescriptiveStatistics normalStats() {
DescriptiveStatistics stats = new DescriptiveStatistics();
logger.log(vo.testFacets() ? Level.INFO : Level.WARN, vo.getMessage());
logger.log(vo.testVertices() ? Level.INFO : Level.WARN, vo.getMessage());
VectorStatistics cStats = new VectorStatistics();
VectorStatistics nStats = new VectorStatistics();
DescriptiveStatistics stats = vo.normalStats();
logger.log(stats.getMean() > 0 ? Level.INFO : Level.WARN, vo.getMessage());
vtkIdList idList = new vtkIdList();
double[] pt0 = new double[3];
double[] pt1 = new double[3];
double[] pt2 = new double[3];
for (int i = 0; i < facetCount(); ++i) {
polyData.GetCellPoints(i, idList);
long id0 = idList.GetId(0);
long id1 = idList.GetId(1);
long id2 = idList.GetId(2);
polyData.GetPoint(id0, pt0);
polyData.GetPoint(id1, pt1);
polyData.GetPoint(id2, pt2);
List<Integer> mfv = vo.findMalformedVertices();
logger.log(!mfv.isEmpty() ? Level.WARN : Level.INFO, vo.getMessage());
// would be faster to check if id0==id1||id0==id2||id1==id2 but there may be
// duplicate vertices
TriangularFacet facet = new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
if (facet.getArea() > 0) {
stats.addValue(facet.getCenter().createUnitized().getDot(facet.getNormal()));
cStats.add(facet.getCenter());
nStats.add(facet.getNormal());
}
}
NavigableMap<Long, List<Long>> dv = vo.findDuplicateVertices();
logger.log(!dv.isEmpty() ? Level.WARN : Level.INFO, vo.getMessage());
validationMsg = String.format(
"Using %d non-zero area facets: Mean angle between radial and normal is %f degrees, "
+ "angle between mean radial and mean normal is %f degrees",
stats.getN(),
Math.toDegrees(Math.acos(stats.getMean())),
Math.toDegrees(Vector3D.angle(cStats.getMean(), nStats.getMean())));
List<Long> urv = vo.findUnreferencedVertices();
logger.log(!urv.isEmpty() ? Level.WARN : Level.INFO, vo.getMessage());
List<Integer> zaf = vo.findZeroAreaFacets();
logger.log(!zaf.isEmpty() ? Level.WARN : Level.INFO, vo.getMessage());
final boolean cleanup = cl.hasOption("cleanup");
final boolean removeDuplicateVertices = cleanup || cl.hasOption("removeDuplicateVertices");
final boolean removeUnreferencedVertices =
cleanup || cl.hasOption("removeUnreferencedVertices");
final boolean removeZeroAreaFacets = cleanup || cl.hasOption("removeZeroAreaFacets");
if (removeDuplicateVertices) polyData = PolyDataUtil.removeDuplicatePoints(polyData);
if (removeUnreferencedVertices) polyData = PolyDataUtil.removeUnreferencedPoints(polyData);
if (removeZeroAreaFacets) polyData = PolyDataUtil.removeZeroAreaFacets(polyData);
if (cl.hasOption("output")) {
PolyDataUtil.saveShapeModelAsOBJ(polyData, cl.getOptionValue("output"));
logger.info(String.format("Wrote OBJ file %s", cl.getOptionValue("output")));
return stats;
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("obj")
.required()
.hasArg()
.desc("Shape model to validate.")
.build());
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(Option.builder("logLevel")
.hasArg()
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim()
+ ". Default is INFO.")
.build());
options.addOption(Option.builder("output")
.hasArg()
.desc("Filename for output OBJ.")
.build());
options.addOption(Option.builder("removeDuplicateVertices")
.desc("Remove duplicate vertices. Use with -output to save OBJ.")
.build());
options.addOption(Option.builder("removeUnreferencedVertices")
.desc("Remove unreferenced vertices. Use with -output to save OBJ.")
.build());
options.addOption(Option.builder("removeZeroAreaFacets")
.desc("Remove facets with zero area. Use with -output to save OBJ.")
.build());
options.addOption(Option.builder("cleanup")
.desc("Combines -removeDuplicateVertices, -removeUnreferencedVertices, and -removeZeroAreaFacets.")
.build());
return options;
}
public static void main(String[] args) throws Exception {
TerrasaurTool defaultOBJ = new ValidateOBJ();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet()) logger.info("{} {}", ml.label, startupMessages.get(ml));
NativeLibraryLoader.loadVtkLibraries();
vtkPolyData polyData = PolyDataUtil.loadShapeModel(cl.getOptionValue("obj"));
if (polyData == null) {
logger.error("Cannot read {}, exiting.", cl.getOptionValue("obj"));
System.exit(0);
}
ValidateOBJ vo = new ValidateOBJ(polyData);
logger.log(vo.testFacets() ? Level.INFO : Level.WARN, vo.getMessage());
logger.log(vo.testVertices() ? Level.INFO : Level.WARN, vo.getMessage());
DescriptiveStatistics stats = vo.normalStats();
logger.log(stats.getMean() > 0 ? Level.INFO : Level.WARN, vo.getMessage());
List<Integer> mfv = vo.findMalformedVertices();
logger.log(!mfv.isEmpty() ? Level.WARN : Level.INFO, vo.getMessage());
NavigableMap<Long, List<Long>> dv = vo.findDuplicateVertices();
logger.log(!dv.isEmpty() ? Level.WARN : Level.INFO, vo.getMessage());
List<Long> urv = vo.findUnreferencedVertices();
logger.log(!urv.isEmpty() ? Level.WARN : Level.INFO, vo.getMessage());
List<Integer> zaf = vo.findZeroAreaFacets();
logger.log(!zaf.isEmpty() ? Level.WARN : Level.INFO, vo.getMessage());
final boolean cleanup = cl.hasOption("cleanup");
final boolean removeDuplicateVertices = cleanup || cl.hasOption("removeDuplicateVertices");
final boolean removeUnreferencedVertices = cleanup || cl.hasOption("removeUnreferencedVertices");
final boolean removeZeroAreaFacets = cleanup || cl.hasOption("removeZeroAreaFacets");
if (removeDuplicateVertices) polyData = PolyDataUtil.removeDuplicatePoints(polyData);
if (removeUnreferencedVertices) polyData = PolyDataUtil.removeUnreferencedPoints(polyData);
if (removeZeroAreaFacets) polyData = PolyDataUtil.removeZeroAreaFacets(polyData);
if (cl.hasOption("output")) {
PolyDataUtil.saveShapeModelAsOBJ(polyData, cl.getOptionValue("output"));
logger.info("Wrote OBJ file {}", cl.getOptionValue("output"));
}
}
}
}
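The validation message above reports the mean angle between each facet's radial direction and its normal, computed as the arccosine of the dot product of the two unit vectors. A minimal standalone sketch of that computation (plain Java, no VTK or Picante types; the class and vector names here are hypothetical):

```java
public class RadialNormalAngle {

    // Dot product of two 3-vectors
    static double dot(double[] a, double[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    // Normalize a 3-vector to unit length
    static double[] unit(double[] v) {
        double n = Math.sqrt(dot(v, v));
        return new double[] {v[0] / n, v[1] / n, v[2] / n};
    }

    // Angle in degrees between the radial (facet center) direction and the facet normal
    static double angleDeg(double[] center, double[] normal) {
        return Math.toDegrees(Math.acos(dot(unit(center), unit(normal))));
    }

    public static void main(String[] args) {
        double[] center = {1, 0, 0}; // facet center on the +X axis
        double[] normal = {1, 1, 0}; // normal tilted 45 degrees away from radial
        System.out.printf("%.1f%n", angleDeg(center, normal)); // prints 45.0
    }
}
```

For a well-formed closed shape model the facet normals point mostly outward, so this mean angle should be small and the mean dot product positive, which is the check applied to `stats.getMean()` above.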


@@ -22,33 +22,36 @@
*/
package terrasaur.config;
import java.util.List;
import jackfruit.annotations.Comment;
import jackfruit.annotations.DefaultValue;
import jackfruit.annotations.Jackfruit;
import java.util.List;
@Jackfruit
public interface CKFromSumFileConfig {
@Comment("""
@Comment(
"""
Body fixed frame for the target body. If blank, use SPICE-defined
body fixed frame. This will be the reference frame unless the J2000
parameter is set to true.""")
@DefaultValue("IAU_DIMORPHOS")
String bodyFrame();
@DefaultValue("IAU_DIMORPHOS")
String bodyFrame();
@Comment("Target body name.")
@DefaultValue("DIMORPHOS")
String bodyName();
@Comment("Target body name.")
@DefaultValue("DIMORPHOS")
String bodyName();
@Comment("""
@Comment(
"""
Extend CK past the last sumFile by this number of seconds. Default
is zero. Attitude is assumed to be fixed to the value given by the
last sumfile.""")
@DefaultValue("0")
double extend();
@DefaultValue("0")
double extend();
@Comment("""
@Comment(
"""
SPC defines the camera X axis to be increasing to the right, Y to
be increasing down, and Z to point into the page:
@@ -74,48 +77,48 @@ public interface CKFromSumFileConfig {
(flipX, flipY, flipZ) = ( 2,-1, 3) SPICE frame is camera frame rotated 90 degrees about Z.
(flipX, flipY, flipZ) = (-2, 1, 3) SPICE frame is camera frame rotated -90 degrees about Z.
(flipX, flipY, flipZ) = ( 1,-2,-3) rotates the image 180 degrees about X.""")
@DefaultValue("-1")
int flipX();
@DefaultValue("-1")
int flipX();
@Comment("Map the camera Y axis to a SPICE axis. See flipX for details.")
@DefaultValue("2")
int flipY();
@Comment("Map the camera Y axis to a SPICE axis. See flipX for details.")
@DefaultValue("2")
int flipY();
@Comment("Map the camera Z axis to a SPICE axis. See flipX for details.")
@DefaultValue("-3")
int flipZ();
@Comment("Map the camera Z axis to a SPICE axis. See flipX for details.")
@DefaultValue("-3")
int flipZ();
@Comment("""
@Comment(
"""
Supply this frame kernel to MSOPCK. Only needed if the reference frame
(set by bodyFrame or J2000) is not built into SPICE""")
@DefaultValue("/project/dart/data/SPICE/flight/fk/didymos_system_001.tf")
String fk();
@DefaultValue("/project/dart/data/SPICE/flight/fk/didymos_system_001.tf")
String fk();
@Comment("Instrument frame name")
@DefaultValue("DART_DRACO")
String instrumentFrameName();
@Comment("Instrument frame name")
@DefaultValue("DART_DRACO")
String instrumentFrameName();
@Comment("If set to true, use J2000 as the reference frame")
@DefaultValue("true")
boolean J2000();
@Comment("If set to true, use J2000 as the reference frame")
@DefaultValue("true")
boolean J2000();
@Comment("Path to leapseconds kernel.")
@DefaultValue("/project/dart/data/SPICE/flight/lsk/naif0012.tls")
String lsk();
@Comment("Path to leapseconds kernel.")
@DefaultValue("/project/dart/data/SPICE/flight/lsk/naif0012.tls")
String lsk();
@Comment("Path to spacecraft SCLK file.")
@DefaultValue("/project/dart/data/SPICE/flight/sclk/dart_sclk_0204.tsc")
String sclk();
@Comment("Path to spacecraft SCLK file.")
@DefaultValue("/project/dart/data/SPICE/flight/sclk/dart_sclk_0204.tsc")
String sclk();
@Comment("Name of spacecraft frame.")
@DefaultValue("DART_SPACECRAFT")
String spacecraftFrame();
@Comment("Name of spacecraft frame.")
@DefaultValue("DART_SPACECRAFT")
String spacecraftFrame();
@Comment("""
@Comment(
"""
SPICE metakernel to read. This may be specified more than once
for multiple metakernels.""")
@DefaultValue("/project/dart/data/SPICE/flight/mk/current.tm")
List<String> metakernel();
@DefaultValue("/project/dart/data/SPICE/flight/mk/current.tm")
List<String> metakernel();
}
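The `(flipX, flipY, flipZ)` convention documented above encodes, for each camera axis, which SPICE axis it maps to (the magnitude: 1=X, 2=Y, 3=Z) and in which direction (the sign). A sketch of that mapping applied to a vector, assuming this reading of the comment (this is an illustration, not the tool's actual implementation):

```java
public class FlipDemo {

    /**
     * Map a camera-frame vector into the SPICE frame using (flipX, flipY, flipZ).
     * Each flip value's magnitude selects the target SPICE axis (1=X, 2=Y, 3=Z)
     * and its sign selects the direction along that axis.
     */
    static double[] cameraToSpice(double[] cam, int flipX, int flipY, int flipZ) {
        double[] spice = new double[3];
        int[] flips = {flipX, flipY, flipZ};
        for (int i = 0; i < 3; i++) {
            int axis = Math.abs(flips[i]) - 1;               // target SPICE axis index
            spice[axis] = Integer.signum(flips[i]) * cam[i]; // signed component copy
        }
        return spice;
    }

    public static void main(String[] args) {
        // Default mapping (-1, 2, -3): camera X -> -X, Y -> +Y, Z -> -Z
        double[] v = cameraToSpice(new double[] {1, 2, 3}, -1, 2, -3);
        System.out.printf("[%.0f, %.0f, %.0f]%n", v[0], v[1], v[2]); // [-1, 2, -3]
    }
}
```

Under this reading, `(2, -1, 3)` sends camera X to +Y and camera Y to -X, which matches the "rotated 90 degrees about Z" description in the comment.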


@@ -29,230 +29,233 @@ import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.spi.StandardLevel;
import terrasaur.utils.saaPlotLib.colorMaps.ColorRamp;
import terrasaur.utils.Log4j2Configurator;
import terrasaur.utils.saaPlotLib.colorMaps.ColorRamp;
public class CommandLineOptions {
private static final Logger logger = LogManager.getLogger();
private static final Logger logger = LogManager.getLogger();
/**
* Configuration file to load
*
* @return
*/
public static Option addConfig() {
return Option.builder("config").hasArg().required().desc("Configuration file to load").build();
}
/**
* Configuration file to load
*
* @return
*/
public static Option addConfig() {
return Option.builder("config")
.hasArg()
.required()
.desc("Configuration file to load")
.build();
}
/**
* Color ramp style. See {@link ColorRamp.TYPE} for valid values.
*
* @param defaultCRType
* @return
*/
public static Option addColorRamp(ColorRamp.TYPE defaultCRType) {
StringBuilder sb = new StringBuilder();
for (ColorRamp.TYPE t : ColorRamp.TYPE.values()) sb.append(String.format("%s ", t.name()));
return Option.builder("colorRamp")
.hasArg()
.desc(
"Color ramp style. Valid values are "
+ sb.toString().trim()
+ ". Default is "
+ defaultCRType.name()
+ ". Run the ColorMaps application to see all supported color ramps.")
.build();
}
/**
* Color ramp style. See {@link ColorRamp.TYPE} for valid values.
*
* @param defaultCRType
* @return
*/
public static Option addColorRamp(ColorRamp.TYPE defaultCRType) {
StringBuilder sb = new StringBuilder();
for (ColorRamp.TYPE t : ColorRamp.TYPE.values()) sb.append(String.format("%s ", t.name()));
return Option.builder("colorRamp")
.hasArg()
.desc("Color ramp style. Valid values are "
+ sb.toString().trim()
+ ". Default is "
+ defaultCRType.name()
+ ". Run the ColorMaps application to see all supported color ramps.")
.build();
}
/**
* Return a color ramp type or the default value.
*
* @param cl
* @param defaultCRType
* @return
*/
public static ColorRamp.TYPE getColorRamp(CommandLine cl, ColorRamp.TYPE defaultCRType) {
ColorRamp.TYPE crType =
cl.hasOption("colorRamp")
? ColorRamp.TYPE.valueOf(cl.getOptionValue("colorRamp").toUpperCase().strip())
: defaultCRType;
return crType;
}
/**
* Return a color ramp type or the default value.
*
* @param cl
* @param defaultCRType
* @return
*/
public static ColorRamp.TYPE getColorRamp(CommandLine cl, ColorRamp.TYPE defaultCRType) {
ColorRamp.TYPE crType = cl.hasOption("colorRamp")
? ColorRamp.TYPE.valueOf(
cl.getOptionValue("colorRamp").toUpperCase().strip())
: defaultCRType;
return crType;
}
/**
* Hard lower limit for color bar. If the color bar minimum is set dynamically it will not be
* lower than hardMin.
*
* @return
*/
public static Option addHardMin() {
return Option.builder("hardMin")
.hasArg()
.desc(
"Hard lower limit for color bar. If the color bar minimum is set dynamically it will not be lower than hardMin.")
.build();
}
/**
* Hard lower limit for color bar. If the color bar minimum is set dynamically it will not be
* lower than hardMin.
*
* @return
*/
public static Option addHardMin() {
return Option.builder("hardMin")
.hasArg()
.desc(
"Hard lower limit for color bar. If the color bar minimum is set dynamically it will not be lower than hardMin.")
.build();
}
/**
* Hard upper limit for color bar. If the color bar maximum is set dynamically it will not be
* higher than hardMax.
*
* @return
*/
public static Option addHardMax() {
return Option.builder("hardMax")
.hasArg()
.desc(
"Hard upper limit for color bar. If the color bar maximum is set dynamically it will not be higher than hardMax.")
.build();
}
/**
* Hard upper limit for color bar. If the color bar maximum is set dynamically it will not be
* higher than hardMax.
*
* @return
*/
public static Option addHardMax() {
return Option.builder("hardMax")
.hasArg()
.desc(
"Hard upper limit for color bar. If the color bar maximum is set dynamically it will not be higher than hardMax.")
.build();
}
/**
* Get the hard minimum for the colorbar.
*
* @param cl
* @return
*/
public static double getHardMin(CommandLine cl) {
return cl.hasOption("hardMin") ? Double.parseDouble(cl.getOptionValue("hardMin")) : Double.NaN;
}
/**
* Get the hard minimum for the colorbar.
*
* @param cl
* @return
*/
public static double getHardMin(CommandLine cl) {
return cl.hasOption("hardMin") ? Double.parseDouble(cl.getOptionValue("hardMin")) : Double.NaN;
}
/**
* Get the hard maximum for the colorbar.
*
* @param cl
* @return
*/
public static double getHardMax(CommandLine cl) {
return cl.hasOption("hardMax") ? Double.parseDouble(cl.getOptionValue("hardMax")) : Double.NaN;
}
/**
* Get the hard maximum for the colorbar.
*
* @param cl
* @return
*/
public static double getHardMax(CommandLine cl) {
return cl.hasOption("hardMax") ? Double.parseDouble(cl.getOptionValue("hardMax")) : Double.NaN;
}
/**
* If present, save screen output to log file.
*
* @return
*/
public static Option addLogFile() {
return Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build();
}
/**
* If present, save screen output to log file.
*
* @return
*/
public static Option addLogFile() {
return Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build();
}
/**
* If present, print messages above selected priority. See {@link StandardLevel} for valid values.
*
* @return
*/
public static Option addLogLevel() {
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
return Option.builder("logLevel")
.hasArg()
.desc(
"If present, print messages above selected priority. Valid values are "
+ sb.toString().trim()
+ ". Default is INFO.")
.build();
}
/**
* If present, print messages above selected priority. See {@link StandardLevel} for valid values.
*
* @return
*/
public static Option addLogLevel() {
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
return Option.builder("logLevel")
.hasArg()
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim()
+ ". Default is INFO.")
.build();
}
/**
* Set the logging level from the command line option.
*
* @param cl
*/
public static void setLogLevel(CommandLine cl) {
Log4j2Configurator lc = Log4j2Configurator.getInstance();
if (cl.hasOption("logLevel"))
lc.setLevel(Level.valueOf(cl.getOptionValue("logLevel").toUpperCase().trim()));
}
/**
* Set the logging level from the command line option.
*
* @param cl
*/
public static void setLogLevel(CommandLine cl) {
Log4j2Configurator lc = Log4j2Configurator.getInstance();
if (cl.hasOption("logLevel"))
lc.setLevel(
Level.valueOf(cl.getOptionValue("logLevel").toUpperCase().trim()));
}
/**
* Log to file named on the command line as well as others
*
* @param cl
* @param others
*/
public static void setLogFile(CommandLine cl, String... others) {
Log4j2Configurator lc = Log4j2Configurator.getInstance();
if (cl.hasOption("logFile")) lc.addFile(cl.getOptionValue("logFile"));
for (String other : others) lc.addFile(other);
}
/**
* Log to file named on the command line as well as others
*
* @param cl
* @param others
*/
public static void setLogFile(CommandLine cl, String... others) {
Log4j2Configurator lc = Log4j2Configurator.getInstance();
if (cl.hasOption("logFile")) lc.addFile(cl.getOptionValue("logFile"));
for (String other : others) lc.addFile(other);
}
/**
* Maximum number of simultaneous threads to execute.
*
* @return
*/
public static Option addNumCPU() {
return Option.builder("numCPU")
.hasArg()
.desc(
"Maximum number of simultaneous threads to execute. Default is numCPU value in configuration file.")
.build();
}
/**
* Maximum number of simultaneous threads to execute.
*
* @return
*/
public static Option addNumCPU() {
return Option.builder("numCPU")
.hasArg()
.desc(
"Maximum number of simultaneous threads to execute. Default is numCPU value in configuration file.")
.build();
}
/**
* Directory to place output files. Default is the working directory.
*
* @return
*/
public static Option addOutputDir() {
return Option.builder("outputDir")
.hasArg()
.desc("Directory to place output files. Default is the working directory.")
.build();
}
/**
* Directory to place output files. Default is the working directory.
*
* @return
*/
public static Option addOutputDir() {
return Option.builder("outputDir")
.hasArg()
.desc("Directory to place output files. Default is the working directory.")
.build();
}
/**
* Set the output dir from the command line argument.
*
* @param cl
* @return
*/
public static String setOutputDir(CommandLine cl) {
String path = cl.hasOption("outputDir") ? cl.getOptionValue("outputDir") : ".";
File parent = new File(path);
if (!parent.exists()) parent.mkdirs();
return path;
}
/**
* Set the output dir from the command line argument.
*
* @param cl
* @return
*/
public static String setOutputDir(CommandLine cl) {
String path = cl.hasOption("outputDir") ? cl.getOptionValue("outputDir") : ".";
File parent = new File(path);
if (!parent.exists()) parent.mkdirs();
return path;
}
/**
* Minimum value to plot.
*
* @return
*/
public static Option addPlotMin() {
return Option.builder("plotMin").hasArg().desc("Min value to plot.").build();
}
/**
* Minimum value to plot.
*
* @return
*/
public static Option addPlotMin() {
return Option.builder("plotMin").hasArg().desc("Min value to plot.").build();
}
/**
* Get plot min from command line argument
*
* @param cl
* @return
*/
public static double getPlotMin(CommandLine cl) {
return cl.hasOption("plotMin") ? Double.parseDouble(cl.getOptionValue("plotMin")) : Double.NaN;
}
/**
* Get plot min from command line argument
*
* @param cl
* @return
*/
public static double getPlotMin(CommandLine cl) {
return cl.hasOption("plotMin") ? Double.parseDouble(cl.getOptionValue("plotMin")) : Double.NaN;
}
/**
* Maximum value to plot.
*
* @return
*/
public static Option addPlotMax() {
return Option.builder("plotMax").hasArg().desc("Max value to plot.").build();
}
/**
* Maximum value to plot.
*
* @return
*/
public static Option addPlotMax() {
return Option.builder("plotMax").hasArg().desc("Max value to plot.").build();
}
/**
* Get plot max from command line argument
*
* @param cl
* @return
*/
public static double getPlotMax(CommandLine cl) {
return cl.hasOption("plotMax") ? Double.parseDouble(cl.getOptionValue("plotMax")) : Double.NaN;
}
/**
* Get plot max from command line argument
*
* @param cl
* @return
*/
public static double getPlotMax(CommandLine cl) {
return cl.hasOption("plotMax") ? Double.parseDouble(cl.getOptionValue("plotMax")) : Double.NaN;
}
}


@@ -30,16 +30,16 @@ import jackfruit.annotations.Jackfruit;
@Jackfruit
public interface ConfigBlock {
String introLines =
"""
String introLines =
"""
###############################################################################
# GENERAL PARAMETERS
###############################################################################
""";
@Comment(
introLines
+ """
@Comment(
introLines
+ """
Set the logging level. Valid values in order of increasing detail:
OFF
FATAL
@@ -50,28 +50,28 @@ public interface ConfigBlock {
TRACE
ALL
See org.apache.logging.log4j.Level.""")
@DefaultValue("INFO")
String logLevel();
@DefaultValue("INFO")
String logLevel();
@Comment(
"Format for log messages. See https://logging.apache.org/log4j/2.x/manual/layouts.html#PatternLayout for more details.")
@DefaultValue("%highlight{%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level [%c{1}:%L] %msg%n%throwable}")
String logFormat();
@Comment(
"Format for log messages. See https://logging.apache.org/log4j/2.x/manual/layouts.html#PatternLayout for more details.")
@DefaultValue("%highlight{%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level [%c{1}:%L] %msg%n%throwable}")
String logFormat();
@Comment(
"""
@Comment(
"""
Format for time strings. Allowed values are:
C (e.g. 1986 APR 12 16:31:09.814)
D (e.g. 1986-102 // 16:31:12.814)
J (e.g. 2446533.18834276)
ISOC (e.g. 1986-04-12T16:31:12.814)
ISOD (e.g. 1986-102T16:31:12.814)""")
@DefaultValue("ISOC")
String timeFormat();
@DefaultValue("ISOC")
String timeFormat();
@Include
MissionBlock missionBlock();
@Include
MissionBlock missionBlock();
@Include
SPCBlock spcBlock();
@Include
SPCBlock spcBlock();
}
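The `timeFormat` options above follow SPICE-style output formats; the ISOC default (e.g. `1986-04-12T16:31:12.814`) can be reproduced with the standard `java.time` API. A small sketch producing that sample string (SPICE itself handles the real formatting; this only illustrates the ISOC shape):

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class TimeFormatDemo {
    public static void main(String[] args) {
        // ISOC-style calendar format with millisecond precision, in UTC
        DateTimeFormatter isoc = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSS")
                .withZone(ZoneOffset.UTC);
        System.out.println(isoc.format(Instant.parse("1986-04-12T16:31:12.814Z")));
        // prints 1986-04-12T16:31:12.814
    }
}
```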


@@ -25,35 +25,34 @@ package terrasaur.config;
import jackfruit.annotations.Comment;
import jackfruit.annotations.DefaultValue;
import jackfruit.annotations.Jackfruit;
import java.util.List;
@Jackfruit(prefix = "mission")
public interface MissionBlock {
String introLines =
"""
String introLines =
"""
###############################################################################
# MISSION PARAMETERS
###############################################################################
""";
@Comment(introLines + "Mission name (e.g. DART)")
@DefaultValue("mission")
String missionName();
@Comment(introLines + "Mission name (e.g. DART)")
@DefaultValue("mission")
String missionName();
@Comment(
"""
@Comment(
"""
SPICE metakernel to read. This may be specified more than once
for multiple metakernels (e.g. /project/dart/data/SPICE/flight/mk/current.tm)""")
@DefaultValue("metakernel.tm")
List<String> metakernel();
@DefaultValue("metakernel.tm")
List<String> metakernel();
@Comment("Name of spacecraft frame (e.g. DART_SPACECRAFT)")
@DefaultValue("SPACECRAFT_FRAME")
String spacecraftFrame();
@Comment("Name of spacecraft frame (e.g. DART_SPACECRAFT)")
@DefaultValue("SPACECRAFT_FRAME")
String spacecraftFrame();
@Comment("Instrument frame name (e.g. DART_DRACO)")
@DefaultValue("INSTRUMENT_FRAME")
String instrumentFrameName();
@Comment("Instrument frame name (e.g. DART_DRACO)")
@DefaultValue("INSTRUMENT_FRAME")
String instrumentFrameName();
}


@@ -29,14 +29,16 @@ import jackfruit.annotations.Jackfruit;
@Jackfruit(prefix = "spc")
public interface SPCBlock {
String introLines =
String introLines =
"""
###############################################################################
# SPC PARAMETERS
###############################################################################
""";
@Comment(introLines + """
@Comment(
introLines
+ """
SPC defines the camera X axis to be increasing to the right, Y to
be increasing down, and Z to point into the page:
@@ -72,5 +74,4 @@ public interface SPCBlock {
@Comment("Map the camera Z axis to a SPICE axis. See flipX for details.")
@DefaultValue("-3")
int flipZ();
}


@@ -41,71 +41,68 @@ import terrasaur.utils.AppVersion;
public class TerrasaurConfig {
private static final Logger logger = LogManager.getLogger();
private static final Logger logger = LogManager.getLogger();
private static TerrasaurConfig instance = null;
private static TerrasaurConfig instance = null;
private TerrasaurConfig() {}
private TerrasaurConfig() {}
private ConfigBlock configBlock;
private ConfigBlock configBlock;
public static ConfigBlock getConfig() {
if (instance == null) {
logger.error("Configuration has not been loaded! Returning null.");
return null;
public static ConfigBlock getConfig() {
if (instance == null) {
logger.error("Configuration has not been loaded! Returning null.");
return null;
}
return instance.configBlock;
}
return instance.configBlock;
}
public static ConfigBlock getTemplate() {
if (instance == null) {
instance = new TerrasaurConfig();
ConfigBlockFactory factory = new ConfigBlockFactory();
instance.configBlock = factory.getTemplate();
public static ConfigBlock getTemplate() {
if (instance == null) {
instance = new TerrasaurConfig();
ConfigBlockFactory factory = new ConfigBlockFactory();
instance.configBlock = factory.getTemplate();
}
return instance.configBlock;
}
return instance.configBlock;
}
public static ConfigBlock load(Path filename) {
if (!Files.exists(filename)) {
System.err.println("Cannot load configuration file " + filename);
Thread.dumpStack();
System.exit(1);
public static ConfigBlock load(Path filename) {
if (!Files.exists(filename)) {
System.err.println("Cannot load configuration file " + filename);
Thread.dumpStack();
System.exit(1);
}
if (instance == null) {
instance = new TerrasaurConfig();
try {
PropertiesConfiguration config = new Configurations().properties(filename.toFile());
instance.configBlock = new ConfigBlockFactory().fromConfig(config);
} catch (ConfigurationException e) {
e.printStackTrace();
}
}
return instance.configBlock;
}
if (instance == null) {
instance = new TerrasaurConfig();
try {
PropertiesConfiguration config = new Configurations().properties(filename.toFile());
instance.configBlock = new ConfigBlockFactory().fromConfig(config);
} catch (ConfigurationException e) {
e.printStackTrace();
}
@Override
public String toString() {
StringWriter string = new StringWriter();
try (PrintWriter pw = new PrintWriter(string)) {
PropertiesConfiguration config = new ConfigBlockFactory().toConfig(instance.configBlock);
PropertiesConfigurationLayout layout = config.getLayout();
String now = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
.withLocale(Locale.getDefault())
.withZone(ZoneOffset.UTC)
.format(Instant.now());
layout.setHeaderComment(
String.format("Configuration file for %s\nCreated %s UTC", AppVersion.getVersionString(), now));
config.write(pw);
} catch (ConfigurationException | IOException e) {
e.printStackTrace();
}
return string.toString();
}
return instance.configBlock;
}
@Override
public String toString() {
StringWriter string = new StringWriter();
try (PrintWriter pw = new PrintWriter(string)) {
PropertiesConfiguration config = new ConfigBlockFactory().toConfig(instance.configBlock);
PropertiesConfigurationLayout layout = config.getLayout();
String now =
DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
.withLocale(Locale.getDefault())
.withZone(ZoneOffset.UTC)
.format(Instant.now());
layout.setHeaderComment(
String.format(
"Configuration file for %s\nCreated %s UTC", AppVersion.getVersionString(), now));
config.write(pw);
} catch (ConfigurationException | IOException e) {
e.printStackTrace();
}
return string.toString();
}
}

File diff suppressed because it is too large


@@ -25,58 +25,64 @@ package terrasaur.enums;
import org.apache.commons.io.FilenameUtils;
public enum FORMATS {
ASCII(true),
BIN3(true),
BIN4(true),
BIN7(true),
FITS(false),
ICQ(false),
OBJ(false),
PLT(false),
PLY(false),
VTK(false);
ASCII(true), BIN3(true), BIN4(true), BIN7(true), FITS(false), ICQ(false), OBJ(false), PLT(
false), PLY(false), VTK(false);
/** True if this format contains no facet information */
public boolean pointsOnly;
/** True if this format contains no facet information */
public boolean pointsOnly;
private FORMATS(boolean pointsOnly) {
this.pointsOnly = pointsOnly;
}
/**
* Guess the format from the (case-insensitive) filename extension.
* <p>
* ASCII: ascii, txt, xyz
* <p>
* BINARY: binary, bin
* <p>
* FITS: fits, fit
* <p>
* L2: l2, dat
* <p>
* OBJ: obj
* <p>
* PLT: plt
* <p>
* PLY: ply
* <p>
* VTK: vtk
*
* @param filename
* @return matched format type, or null if a match is not found
*/
public static FORMATS formatFromExtension(String filename) {
String extension = FilenameUtils.getExtension(filename);
for (FORMATS f : FORMATS.values()) {
if (f.name().equalsIgnoreCase(extension)) {
return f;
}
private FORMATS(boolean pointsOnly) {
this.pointsOnly = pointsOnly;
}
switch (extension.toUpperCase()) {
case "TXT":
case "XYZ":
return FORMATS.ASCII;
case "BIN":
return FORMATS.BIN3;
case "FIT":
return FORMATS.FITS;
/**
* Guess the format from the (case-insensitive) filename extension.
* <p>
* ASCII: ascii, txt, xyz
* <p>
* BINARY: binary, bin
* <p>
* FITS: fits, fit
* <p>
* L2: l2, dat
* <p>
* OBJ: obj
* <p>
* PLT: plt
* <p>
* PLY: ply
* <p>
* VTK: vtk
*
* @param filename
* @return matched format type, or null if a match is not found
*/
public static FORMATS formatFromExtension(String filename) {
String extension = FilenameUtils.getExtension(filename);
for (FORMATS f : FORMATS.values()) {
if (f.name().equalsIgnoreCase(extension)) {
return f;
}
}
switch (extension.toUpperCase()) {
case "TXT":
case "XYZ":
return FORMATS.ASCII;
case "BIN":
return FORMATS.BIN3;
case "FIT":
return FORMATS.FITS;
}
return null;
}
return null;
}
}
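`formatFromExtension` above tries an exact (case-insensitive) match against the enum names first, then falls back to a small alias table. A standalone sketch of that lookup logic with a reduced enum and stdlib-only extension parsing (the real class uses Commons IO `FilenameUtils` and the full `FORMATS` set):

```java
import java.util.Locale;

public class ExtensionDemo {
    enum Format { ASCII, BIN3, FITS, OBJ }

    /** Exact enum-name match first, then known aliases; null if unmatched. */
    static Format fromExtension(String filename) {
        int dot = filename.lastIndexOf('.');
        String ext = dot < 0 ? "" : filename.substring(dot + 1);
        for (Format f : Format.values()) {
            if (f.name().equalsIgnoreCase(ext)) {
                return f;
            }
        }
        switch (ext.toUpperCase(Locale.ROOT)) {
            case "TXT":
            case "XYZ":
                return Format.ASCII;
            case "BIN":
                return Format.BIN3;
            case "FIT":
                return Format.FITS;
        }
        return null; // unknown extension
    }

    public static void main(String[] args) {
        System.out.println(fromExtension("model.obj")); // OBJ
        System.out.println(fromExtension("cloud.xyz")); // ASCII
        System.out.println(fromExtension("cloud.dat")); // null
    }
}
```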


@@ -25,31 +25,38 @@ package terrasaur.enums;
/**
* Enums for a given fits header format. This is used to keep fits headers for the different types
* separately configurable.
*
*
* @author espirrc1
*
*/
public enum FitsHeaderType {
ANCIGLOBALGENERIC,
ANCILOCALGENERIC,
ANCIGLOBALALTWG,
ANCIG_FACETRELATION_ALTWG,
ANCILOCALALTWG,
DTMGLOBALALTWG,
DTMLOCALALTWG,
DTMGLOBALGENERIC,
DTMLOCALGENERIC,
NFTMLN;
ANCIGLOBALGENERIC, ANCILOCALGENERIC, ANCIGLOBALALTWG, ANCIG_FACETRELATION_ALTWG, ANCILOCALALTWG, DTMGLOBALALTWG, DTMLOCALALTWG, DTMGLOBALGENERIC, DTMLOCALGENERIC, NFTMLN;
public static boolean isGLobal(FitsHeaderType hdrType) {
public static boolean isGLobal(FitsHeaderType hdrType) {
boolean globalProduct;
boolean globalProduct;
switch (hdrType) {
case ANCIGLOBALALTWG:
case ANCIGLOBALGENERIC:
case DTMGLOBALALTWG:
case DTMGLOBALGENERIC:
globalProduct = true;
break;
switch (hdrType) {
default:
globalProduct = false;
}
case ANCIGLOBALALTWG:
case ANCIGLOBALGENERIC:
case DTMGLOBALALTWG:
case DTMGLOBALGENERIC:
globalProduct = true;
break;
default:
globalProduct = false;
return globalProduct;
}
return globalProduct;
}
}
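The switch in `isGLobal` above classifies header types by fall-through over the global cases. The same membership test can be expressed with an `EnumSet`; a sketch with a reduced enum (not the actual `FitsHeaderType` values beyond the four shown):

```java
import java.util.EnumSet;

public class GlobalCheckDemo {
    enum HeaderType { ANCIGLOBALGENERIC, ANCILOCALGENERIC, DTMGLOBALALTWG, DTMLOCALALTWG }

    // Membership test equivalent to the fall-through switch above
    static final EnumSet<HeaderType> GLOBAL =
            EnumSet.of(HeaderType.ANCIGLOBALGENERIC, HeaderType.DTMGLOBALALTWG);

    static boolean isGlobal(HeaderType t) {
        return GLOBAL.contains(t);
    }

    public static void main(String[] args) {
        System.out.println(isGlobal(HeaderType.DTMGLOBALALTWG));   // true
        System.out.println(isGlobal(HeaderType.ANCILOCALGENERIC)); // false
    }
}
```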


@@ -32,147 +32,143 @@ import nom.tam.fits.HeaderCardException;
* Enumeration containing the values and comments to use for FITS tags describing data stored in
* FITS data cubes. The enumeration name references the type of data stored in a given plane. This
* way the user can choose their own value for the FITS keyword (i.e. "PLANE1" or "PLANE10").
*
*
* @author espirrc1
*
*
*/
public enum PlaneInfo {
//@formatter:off
LAT("Latitude of vertices", "[deg]", "deg"),
LON("Longitude of vertices", "[deg]", "deg"),
RAD("Radius of vertices", "[km]", "km"),
X("X coordinate of vertices", "[km]", "km"),
Y("Y coordinate of vertices", "[km]", "km"),
Z("Z coordinate of vertices", "[km]", "km"),
NORM_VECTOR_X("Normal vector X", null, null),
NORM_VECTOR_Y("Normal vector Y", null, null),
NORM_VECTOR_Z("Normal vector Z", null, null),
GRAV_VECTOR_X("Gravity vector X", "[m/s^2]", "m/s**2"),
GRAV_VECTOR_Y("Gravity vector Y", "[m/s^2]", "m/s**2"),
GRAV_VECTOR_Z("Gravity vector Z", "[m/s^2]", "m/s**2"),
GRAV_MAG("Gravitational magnitude", "[m/s^2]", "m/s**2"),
GRAV_POT("Gravitational potential", "[J/kg]", "J/kg"),
ELEV("Elevation", "[m]", "m"),
AREA("Area", "[km^2]", "km**2"),
// @formatter:off
LAT("Latitude of vertices", "[deg]", "deg"),
LON("Longitude of vertices", "[deg]", "deg"),
RAD("Radius of vertices", "[km]", "km"),
X("X coordinate of vertices", "[km]", "km"),
Y("Y coordinate of vertices", "[km]", "km"),
Z("Z coordinate of vertices", "[km]", "km"),
NORM_VECTOR_X("Normal vector X", null, null),
NORM_VECTOR_Y("Normal vector Y", null, null),
NORM_VECTOR_Z("Normal vector Z", null, null),
GRAV_VECTOR_X("Gravity vector X", "[m/s^2]", "m/s**2"),
GRAV_VECTOR_Y("Gravity vector Y", "[m/s^2]", "m/s**2"),
GRAV_VECTOR_Z("Gravity vector Z", "[m/s^2]", "m/s**2"),
GRAV_MAG("Gravitational magnitude", "[m/s^2]", "m/s**2"),
GRAV_POT("Gravitational potential", "[J/kg]", "J/kg"),
ELEV("Elevation", "[m]", "m"),
AREA("Area", "[km^2]", "km**2"),
// no longer needed! same as HEIGHT!
// ELEV_NORM("Elevation relative to normal plane", "[m]", "m"),
SLOPE("Slope", "[deg]", "deg"),
SHADE("Shaded relief", null, null),
TILT("Facet tilt", "[deg]", "deg"),
TILT_DIRECTION("Facet tilt direction", "[deg]", "deg"),
TILT_AVERAGE("Mean tilt", "[deg]", "deg"),
TILT_VARIATION("Tilt variation", "[deg]", "deg"),
TILT_AVERAGE_DIRECTION("Mean tilt direction", "[deg]", "deg"),
TILT_DIRECTION_VARIATION("Tilt direction variation", "[deg]", "deg"),
TILT_RELATIVE("Relative tilt", "[deg]", "deg"),
TILT_RELATIVE_DIRECTION("Relative tilt direction", "[deg]", "deg"),
TILT_UNCERTAINTY("Tilt Uncertainty", "[deg]", "deg"),
FACETRAD("Facet radius", "[m]", "m"),
HEIGHT("Height relative to normal plane", "[km]", "km"),
RELATIVE_HEIGHT("Max relative height", "[km]", "km"),
ALBEDO("Relative albedo", null, null),
INTENSITY("Return Intensity", null, null),
SIGMA("Sigma", null, null),
QUALITY("Quality", null, null),
SHADED("Shaded relief", null, null),
NUMPOINTS("Number of OLA points used", null, null),
HEIGHT_RESIDUAL("Mean of residual between points and fitted height", "[km]", "km"),
HEIGHT_STDDEV("Std deviation of residual between points and fitted height", "[km]", "km"),
HAZARD("Hazard", "1 indicates a hazard to the spacecraft", null);
// @formatter:on

  private String keyValue; // value associated with FITS keyword
  private String comment; // comment associated with FITS keyword
  private String units; // units associated with the plane. Usually in PDS4 nomenclature

  PlaneInfo(String keyVal, String comment, String units) {
    this.keyValue = keyVal;
    this.comment = comment;
    this.units = units;
  }

  public String value() {
    return keyValue;
  }

  public String comment() {
    return comment;
  }

  public String units() {
    return units;
  }

  /**
   * Try to parse the enum from the given Keyval string. Needs to match exactly (but case
   * insensitive)!
   *
   * @param keyVal
   * @return
   */
  public static PlaneInfo keyVal2Plane(String keyVal) {
    for (PlaneInfo planeName : values()) {
      if ((planeName.value() != null) && (planeName.value().equalsIgnoreCase(keyVal))) {
        return planeName;
      }
    }
    return null;
  }

  public static PlaneInfo planeFromString(String plane) {
    for (PlaneInfo planeName : values()) {
      if (planeName.toString().equals(plane)) {
        return planeName;
      }
    }
    return null;
  }

  /*
   * Create enumeration set for the first 6 planes. These are the initial planes created from the
   * Osiris Rex netCDF file.
   */
  public static final EnumSet<PlaneInfo> first6HTags = EnumSet.range(PlaneInfo.LAT, PlaneInfo.Z);

  public static List<PlaneInfo> coreTiltPlanes() {
    List<PlaneInfo> coreTilts = new ArrayList<PlaneInfo>();
    coreTilts.add(PlaneInfo.TILT_AVERAGE);
    coreTilts.add(PlaneInfo.TILT_VARIATION);
    coreTilts.add(PlaneInfo.TILT_AVERAGE_DIRECTION);
    coreTilts.add(PlaneInfo.TILT_DIRECTION_VARIATION);
    coreTilts.add(PlaneInfo.TILT_RELATIVE);
    coreTilts.add(PlaneInfo.TILT_RELATIVE_DIRECTION);
    coreTilts.add(PlaneInfo.RELATIVE_HEIGHT);
    return coreTilts;
  }

  /**
   * Convert List<PlaneInfo> to List<HeaderCard> where each HeaderCard in List follows the
   * convention: for each "thisPlane" in List<PlaneInfo> HeaderCard = new HeaderCard("PLANE" + cc,
   * thisPlane.value(), thisPlane.comment()) The order in List<HeaderCard> follows the order in
   * List<PlaneInfo>
   *
   * @param planeList
   * @return
   * @throws HeaderCardException
   */
  public static List<HeaderCard> planesToHeaderCard(List<PlaneInfo> planeList) throws HeaderCardException {
    List<HeaderCard> planeHeaders = new ArrayList<HeaderCard>();
    String plane = "PLANE";
    int cc = 1;
    for (PlaneInfo thisPlane : planeList) {
      planeHeaders.add(new HeaderCard(plane + cc, thisPlane.value(), thisPlane.comment()));
      cc++;
    }
    return planeHeaders;
  }
}
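The `keyVal2Plane` lookup above matches a FITS key value case-insensitively against each constant's stored string. A minimal, self-contained sketch of that pattern is below; the trimmed `Plane` enum and the `PlaneLookupDemo` class are hypothetical stand-ins for `PlaneInfo`, kept small enough to compile on their own:

```java
// Hypothetical trimmed version of PlaneInfo: two constants, same lookup logic.
enum Plane {
    SLOPE("Slope"), TILT("Facet tilt");

    private final String keyValue;

    Plane(String keyValue) {
        this.keyValue = keyValue;
    }

    // Mirrors PlaneInfo.keyVal2Plane: exact, case-insensitive match on the key
    // value; null when no constant matches.
    static Plane keyVal2Plane(String keyVal) {
        for (Plane p : values()) {
            if (p.keyValue != null && p.keyValue.equalsIgnoreCase(keyVal)) {
                return p;
            }
        }
        return null;
    }
}

public class PlaneLookupDemo {
    public static void main(String[] args) {
        System.out.println(Plane.keyVal2Plane("slope"));      // SLOPE
        System.out.println(Plane.keyVal2Plane("Facet Tilt")); // TILT
        System.out.println(Plane.keyVal2Plane("unknown"));    // null
    }
}
```

Note that the match must be exact apart from case: a partial key like `"Facet"` would return null, which is why the real enum also offers `planeFromString` keyed on the constant name itself.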

View File

@@ -27,112 +27,109 @@ import com.google.common.base.Strings;
/**
 * Enum for defining the types of sigma files that can be loaded and utilized by the Pipeline. This
 * allows the pipeline to load and parse different formats of sigma files.
 *
 * @author espirrc1
 */
public enum SigmaFileType {
  SPCSIGMA {
    @Override
    public String commentSymbol() {
      return "";
    }

    @Override
    public String stringArg() {
      return "spc";
    }

    @Override
    public int sigmaCol() {
      return 3;
    }
  },
  ERRORFROMSQLSIGMA {
    @Override
    public String commentSymbol() {
      return "#";
    }

    @Override
    public String stringArg() {
      return "errorfromsql";
    }

    // should be the Standard Deviation column in ErrorFromSQL file.
    public int sigmaCol() {
      return 8;
    }
  },
  NOMATCH {
    @Override
    public String commentSymbol() {
      return "NAN";
    }

    @Override
    public String stringArg() {
      return "NAN";
    }

    public int sigmaCol() {
      return -1;
    }
  };

  // returns the symbol that is used to denote comment lines
  public abstract String commentSymbol();

  // input argument to match
  public abstract String stringArg();

  // column number where sigma values are stored
  public abstract int sigmaCol();

  public static SigmaFileType getFileType(String sigmaFileType) {
    if (!Strings.isNullOrEmpty(sigmaFileType)) {
      for (SigmaFileType thisType : values()) {
        if (sigmaFileType.toLowerCase().equals(thisType.stringArg())) {
          return thisType;
        }
      }
    }
    return NOMATCH;
  }

  /**
   * Return the SigmaFileType associated with the SrcProductType.
   *
   * @param srcType
   * @return
   */
  public static SigmaFileType sigmaTypeFromSrcType(SrcProductType srcType) {
    SigmaFileType sigmaType = SigmaFileType.NOMATCH;
    switch (srcType) {
      case SPC:
        sigmaType = SigmaFileType.SPCSIGMA;
        break;
      case OLA:
        sigmaType = SigmaFileType.ERRORFROMSQLSIGMA;
        break;
      default:
        sigmaType = SigmaFileType.NOMATCH;
        break;
    }
    return sigmaType;
  }
}
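The `getFileType` dispatch above lowercases the argument and matches it against each constant's `stringArg()`, with `NOMATCH` as the fallback. The same pattern can be sketched without the Guava dependency; `SigmaKind` and `SigmaLookupDemo` are hypothetical names, and a plain null/empty check stands in for `Strings.isNullOrEmpty`:

```java
// Hypothetical trimmed version of SigmaFileType's string-to-constant dispatch.
enum SigmaKind {
    SPCSIGMA("spc"), ERRORFROMSQLSIGMA("errorfromsql"), NOMATCH("NAN");

    private final String arg;

    SigmaKind(String arg) {
        this.arg = arg;
    }

    // Mirrors SigmaFileType.getFileType: case-insensitive match on the argument
    // string, falling back to NOMATCH for null, empty, or unrecognized input.
    static SigmaKind getFileType(String s) {
        if (s != null && !s.isEmpty()) {
            for (SigmaKind k : values()) {
                if (s.toLowerCase().equals(k.arg)) {
                    return k;
                }
            }
        }
        return NOMATCH;
    }
}

public class SigmaLookupDemo {
    public static void main(String[] args) {
        System.out.println(SigmaKind.getFileType("SPC"));  // SPCSIGMA
        System.out.println(SigmaKind.getFileType(null));   // NOMATCH
        System.out.println(SigmaKind.getFileType("typo")); // NOMATCH
    }
}
```

Returning a sentinel constant (`NOMATCH`) instead of null lets callers switch on the result without a null check, which is the same design choice `sigmaTypeFromSrcType` relies on.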

View File

@@ -25,91 +25,86 @@ package terrasaur.enums;
/**
 * Enumeration storing the source product type: the product type of the source data used in creation
 * of an ALTWG product.
 *
 * @author espirrc1
 */
public enum SrcProductType {
  SFM {
    @Override
    public String getAltwgFrag() {
      return "sfm";
    }
  },
  SPC {
    @Override
    public String getAltwgFrag() {
      return "spc";
    }
  },
  // OLA Altimetry
  OLA {
    @Override
    public String getAltwgFrag() {
      return "alt";
    }
  },
  // SPC-OLA
  SPO {
    @Override
    public String getAltwgFrag() {
      return "spo";
    }
  },
  TRUTH {
    @Override
    public String getAltwgFrag() {
      return "tru";
    }
  },
  UNKNOWN {
    @Override
    public String getAltwgFrag() {
      return "unk";
    }
  };

  public static SrcProductType getType(String value) {
    value = value.toUpperCase();
    for (SrcProductType srcType : values()) {
      if (srcType.toString().equals(value)) {
        return srcType;
      }
    }
    return UNKNOWN;
  }

  /**
   * Returns the string fragment associated with the source product type. Follows the ALTWG naming
   * convention.
   *
   * @return
   */
  public abstract String getAltwgFrag();

  /**
   * Return the SrcProductType whose getAltwgFrag() string matches the stringFrag. Return UNKNOWN if
   * a match is not found.
   *
   * @param stringFrag
   * @return
   */
  public static SrcProductType fromAltwgFrag(String stringFrag) {
    for (SrcProductType prodType : SrcProductType.values()) {
      if (prodType.getAltwgFrag().equals(stringFrag)) return prodType;
    }
    return UNKNOWN;
  }
}
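`SrcProductType` defines a two-way mapping: `getAltwgFrag()` turns a constant into its ALTWG filename fragment, and `fromAltwgFrag` inverts it with `UNKNOWN` as the fallback. A minimal sketch of that round trip, using a hypothetical trimmed enum `Src` rather than the full type:

```java
// Hypothetical trimmed version of SrcProductType: constant <-> ALTWG fragment.
enum Src {
    SPC("spc"), OLA("alt"), UNKNOWN("unk");

    private final String frag;

    Src(String frag) {
        this.frag = frag;
    }

    String getAltwgFrag() {
        return frag;
    }

    // Mirrors SrcProductType.fromAltwgFrag: UNKNOWN when nothing matches.
    static Src fromAltwgFrag(String frag) {
        for (Src s : values()) {
            if (s.frag.equals(frag)) {
                return s;
            }
        }
        return UNKNOWN;
    }
}

public class SrcDemo {
    public static void main(String[] args) {
        // Round trip: OLA maps to "alt", and "alt" maps back to OLA.
        if (Src.fromAltwgFrag(Src.OLA.getAltwgFrag()) != Src.OLA) {
            throw new AssertionError("round trip failed");
        }
        System.out.println(Src.fromAltwgFrag("xyz")); // UNKNOWN
    }
}
```

The round trip only holds because every fragment string is unique; a duplicate fragment would make `fromAltwgFrag` return whichever constant is declared first.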

View File

@@ -27,298 +27,344 @@ import java.util.Map;
public enum AltPipelnEnum {
  // settings enums. Used to determine type of product to create.
  ANCIGLOBAL,

  // optional: tells code to skip generation of products for highest res shape model
  SKIPORIGINALSHP,

  // controls whether or not certain global products get created.
  // REQUIRED TO BE IN CONFIGFILE:
  DOGLOBALSHAPE,
  OBJTOFITS,
  ADDOBJCOMMENTS,
  GLOBALRES0,
  DUMBERVALUES,
  DOGRAVGLOBAL,
  DOGLOBALGRAV_ANCI,
  DOGLOBALTILT_ANCI,
  DOGLOBALMISC_ANCI,
  DOGLOBALTILT,

  // controls number of slots per job to use when running global distributed gravity
  // in grid engine mode. Does not apply if running in local parallel mode
  GLOBALGRAVSLOTS,

  // controls number of slots per job to use when running local distributed gravity
  // in grid engine mode. Does not apply if running in local parallel mode
  LOCALGRAVSLOTS,

  // full path to SPICE metakernel to use when generating DSK products
  DSKERNEL,

  // enable/disable the creation of global and local DSK files
  DOGLOBALDSK,
  DOLOCALDSK,

  // global tilt semi-major axis in km. Now a required variable
  GTILTMAJAXIS,

  // settings for every local product generated by the pipeline
  USEOLDMAPLETS,
  DODTMFITSOBJ,
  DOGRAVLOCAL,
  GENLOCALGRAV,
  DOLOCALTILT,
  DOLOCAL_GRAVANCI,
  DOLOCAL_TILTANCI,
  DOLOCAL_MISCANCI,

  // controls whether to use average gravitational potential for global and local gravity
  // calculations. =0 use minimum reference potential, =1 use average reference potential
  AVGGRAVPOTGLOBAL,
  AVGGRAVPOTLOCAL,

  // controls RunBigMap.
  // integrate slope to height required. Defaults to "n" if this enum does not exist in config file.
  // otherwise will evaluate value. 0="n", 1="y"
  INTSLOPE,

  // use grotesque model in RunBigMap. Defaults to not using it if this enum does not exist in
  // config file
  // otherwise will evaluate value. 0="do not use grotesque model", 1="do use grotesque model"
  USEGROTESQUE,

  // controls the source data, product destination, naming convention,
  // and process flow.
  DATASOURCE,
  REPORTTYPE,
  INDATADIR,
  OUTPUTDIR,
  PDSNAMING,
  RENAMEGLOBALOBJ,
  USEBIGMAPS,
  DOREPORT,

  // shape model density and rotation rate are now required variables. This way we can easily spot
  // what we are using as defaults.
  SMDENSITY,
  SMRRATE,

  // stores type of naming convention. Ex. AltProduct, AltNFTMLN, DartProduct.
  NAMINGCONVENTION,

  // set values that cannot be derived from data.
  REPORTBASENAME,
  VERSION,

  // everything after this is not a required keyword

  // (Optional) controls whether there is an external body that needs to be accounted for when
  // running gravity code
  // the values should be a csv string with no spaces. The values are: mass(kg),x,y,z where x,y,z
  // are the body fixed coordinates in km.
  // e.g. 521951167,1.19,0,0
  EXTERNALBODY,

  // (Optional). If the keyword exists and value is 1 then no GLOBAL DTMs are assumed to be created.
  // for example, in the DART Derived Product set we are not creating g_*dtm*.fits files
  NOGLOBALDTM,

  // (Optional). If the keyword exists then evaluate the shapes to process by parsing the
  // comma-separated values. Ex. values are 0,1,2 then pipeline will assume it has to
  // process shape0, shape1, shape2. The pipeline will also disregard the values in
  // DUMBERVALUES that otherwise determine how many shape files to process.
  SHAPE2PROC,

  // (optional) controls whether or not STL files get generated. If these do not exist in the
  // pipeConfig file then they will NOT get generated!
  GLOBALSTL,
  LOCALSTL,

  // keywords for local products
  //
  // MAPSmallerSZ: resize local DTMs to a different half size. For pipeline we may want to generate
  // DTMs at halfsize + tilt radius then resize the DTMs to halfsize in order to have tilts
  // evaluated with the full range of points at the edges.
  //
  // MAPFILE: contains pointer to map centers file (optional). Used by TileShapeModelWithBigmaps.
  // defaults to auto-generated tiles if this is not specified.
  // allow for pointers to different files for 30cm and 10cm map products.
  DOLOCALMAP,
  MAPDIR,
  MAPSCALE,
  MAPHALFSZ,
  REPORT,
  MAPSmallerSZ,
  MAPFILE,
  ISTAGSITE,
  LTILTMAJAXIS,
  LTILTMINAXIS,
  LTILTPA,
  MAXSCALE,

  // settings for local tag sites. Note TAGSFILE is not optional.
  // it contains the tagsite name and lat,lon of tag site tile center
  TAGDIR,
  TAGSCALE,
  TAGHALFSZ,
  TAGSFILE,
  REPORTTAG,

  // pointer to OLA database. only required if DATASOURCE is OLA
  OLADBPATH,

  // force sigma files to all be NaN
  FORCESIGMA_NAN,

  // global sigma scale factor
  SIGMA_SCALEFACTOR,

  // local sigma scale factor
  LOCAL_SIGMA_SCALEFACTOR,

  // SIGMA file type. No longer tied to DataSource!
  SIGMAFILE_TYPE,

  // force the Report page to be HTML. Default is created as PHP
  REPORTASHTML,

  /*
   * The following are used to change default values used by the pipeline: these are the shape model
   * density, rotation rate, gravitational algorithm, gravitational constant, global average
   * reference potential, local average reference potential. Added values defining the tilt ellipse
   * to use when evaluating tilts. Note the different enums for global versus local tilt ellipse
   * parameters. The pipeline will use default values for these enums if they are not defined in the
   * pipeline configuration file.
   */
  GALGORITHM,
  GRAVCONST,
  GTILTMINAXIS,
  GTILTPA,
  MASSUNCERT,
  VSEARCHRAD_PCTGSD,
  FSEARCHRAD_PCTGSD,
  PXPERDEG,

  // The following are options to subvert normal pipeline operations or to configure pipeline for
  // other missions

  // global objs are supplied at all resolutions as the starting point.
  // This means we can skip ICQ2PLT, ICQDUMBER, and PLT2OBJ calls
  OBJASINPUT,

  // gzip the obj files to save space
  DOGZIP,

  // specify the queue to use in the GRID ENGINE
  GRIDQUEUE,

  // default mode for local product creation is to parallelize DistributedGravity
  // for each tile. Then processing for each job is done in local mode.
  // set this flag to 1 to submit DistributedGravity for each tile sequentially,
  // and have each job spawn to the grid engine
  DISTGRAVITY_USEGRID,

  // override grid engine mode and use local parallel mode with the specified number of cores
  LOCALPARALLEL,

  // when creating local gravity products skip creation of gravity files that already exist
  USEOLDGRAV,

  // override ancillary fits table default setting (binary). Set to ASCII instead
  ANCITXTTABLE,

  // contains pointer to fits header config file (optional)
  FITSCONFIGFILE,

  // contains pointer to OBJ comments header file (optional). Will only
  // add comments if ADDOBJCOMMENTS flag is set.
  OBJCOMHEADER,

  // identifies whether there are poisson reconstruct data products to include in the webpage report
  LOCALPOISSON,
  GLOBALPOISSON;

  /*
   * The following enumerations are required to exist in the altwg pipeline config file.
   */
  public static final EnumSet<AltPipelnEnum> reqTags = EnumSet.range(DOGLOBALSHAPE, VERSION);

  /*
   * The following enumerations do not have to be present in the config file. But, if they are not
   * then the pipeline should use the default values associated with the enums.
   */
  public static final EnumSet<AltPipelnEnum> overrideTags = EnumSet.range(SMDENSITY, GTILTPA);

  public static String mapToString(Map<AltPipelnEnum, String> pipeConfig) {
    StringBuilder sb = new StringBuilder();
    for (Map.Entry<AltPipelnEnum, String> entry : pipeConfig.entrySet()) {
      sb.append(String.format("%s:%s\n", entry.getKey().toString(), entry.getValue()));
    }
    return sb.toString();
  }

  public static AltPipelnEnum fromString(String textString) {
    for (AltPipelnEnum anciType : AltPipelnEnum.values()) {
      if (anciType.toString().equals(textString)) return anciType;
    }
    return null;
  }

  /**
   * Convenience method for evaluating a configuration parameter where the value indicates whether
   * or not the parameter is true or false via an integer value. Assumes 0 or less = false, 1 or
   * more = true. If the key does not exist then return false.
   *
   * @param pipeConfig
   * @param key
   * @return
   */
  public static boolean isTrue(Map<AltPipelnEnum, String> pipeConfig, AltPipelnEnum key) {
    boolean returnFlag = false;
    int parsedVal = 0;
    if (pipeConfig.containsKey(key)) {
      try {
        parsedVal = Integer.valueOf(pipeConfig.get(key));
      } catch (NumberFormatException e) {
        System.err.println("ERROR! Could not parse integer value for pipeConfig line:");
        System.err.println(key.toString() + "," + pipeConfig.get(key));
        System.err.println("Stopping with error!");
        System.exit(1);
      }
      if (parsedVal > 0) {
        returnFlag = true;
      }
    }
    return returnFlag;
  }

  /**
   * Checks to see whether key exists. If so then return value mapped to key. Otherwise return empty
   * string.
   *
   * @param pipeConfig
   * @param key
   * @return
   */
  public static String checkAndGet(Map<AltPipelnEnum, String> pipeConfig, AltPipelnEnum key) {
    String value = "";
    if (pipeConfig.containsKey(key)) {
      value = pipeConfig.get(key);
    }
    return value;
  }

  /*
   * Some enums will have a default value, e.g. the ones in the overrideTags EnumSet. It is easier
   * to keep them as string values then convert them to other primitives as needed. Sometimes other
   * executables will be called w/ the default values, so it is better to keep them as strings to
   * avoid double conversion.
   */
  public static String getDefault(AltPipelnEnum thisEnum) {
    if (overrideTags.contains(thisEnum)) {
      switch (thisEnum) {
        // shape model density and rotation rate must now be explicitly defined in the configuration
        // file!
        // case SMDENSITY:
        // return "1.186";
        //
        // case SMRRATE:
        // return "0.00040626";

        case GALGORITHM:
          return "werner";

        case GRAVCONST:
          return "6.67408e-11";

        case LTILTMAJAXIS:
          return "0.0125";

        case GTILTMINAXIS:
          return "0.0125";

        case GTILTPA:
        case LTILTPA:
          return "0.0";

        case MASSUNCERT:
          return "0.01";

        case VSEARCHRAD_PCTGSD:
          return "0.25";

        case FSEARCHRAD_PCTGSD:
          return "0.5";

        default:
          return "NA";
      }
    }
    return "NA";
  }
}
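The `isTrue` convenience method above treats a config value as an integer flag: values greater than zero mean true, anything else (including a missing key) means false. A minimal sketch of that behavior follows; `ConfigFlagDemo` is a hypothetical name, the sketch uses `String` keys instead of `AltPipelnEnum` keys, and it throws on a parse failure where the real method prints an error and calls `System.exit`:

```java
import java.util.Map;

public class ConfigFlagDemo {
    // Sketch of AltPipelnEnum.isTrue: integer string > 0 means true,
    // 0 or negative means false, missing key means false.
    static boolean isTrue(Map<String, String> config, String key) {
        String raw = config.get(key);
        if (raw == null) {
            return false; // key absent: treated as false
        }
        try {
            return Integer.parseInt(raw) > 0;
        } catch (NumberFormatException e) {
            // The real pipeline method logs the offending line and exits here.
            throw new IllegalArgumentException("not an integer flag: " + key + "=" + raw);
        }
    }

    public static void main(String[] args) {
        Map<String, String> cfg = Map.of("DOGLOBALSHAPE", "1", "DOGZIP", "0");
        System.out.println(isTrue(cfg, "DOGLOBALSHAPE")); // true
        System.out.println(isTrue(cfg, "DOGZIP"));        // false
        System.out.println(isTrue(cfg, "MISSING"));       // false
    }
}
```

Treating a missing key as false is what lets most of the optional keywords above default to "off" when they are simply left out of the pipeline config file.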

View File

@@ -30,53 +30,49 @@ import terrasaur.enums.FitsHeaderType;
public class AltwgAnciGlobal extends AnciTableFits implements AnciFitsHeader {

    public AltwgAnciGlobal(FitsHdr fitsHeader) {
        super(fitsHeader, FitsHeaderType.ANCIGLOBALALTWG);
    }

    // methods below override the concrete methods in AnciTableFits abstract class or
    // are specific to this class

    /**
     * Create fits header as a list of HeaderCard. List contains the keywords in the order of
     * appearance in the ALTWG fits header. Overrides default implementation in AnciTableFits.
     */
    @Override
    public List<HeaderCard> createFitsHeader() throws HeaderCardException {
        List<HeaderCard> headers = new ArrayList<HeaderCard>();
        headers.addAll(getHeaderInfo("header information"));
        headers.addAll(getMissionInfo("mission information"));
        headers.addAll(getIDInfo("identification info"));
        headers.addAll(getMapDataSrc("shape data source"));
        headers.addAll(getProcInfo("processing information"));
        headers.addAll(getMapInfo("map specific information"));
        headers.addAll(getSpatialInfo("summary spatial information"));
        headers.addAll(getSpecificInfo("product specific"));
        return headers;
    }

    /** Contains OREX-SPOC specific keywords. */
    @Override
    public List<HeaderCard> getIDInfo(String comment) throws HeaderCardException {
        List<HeaderCard> headers = new ArrayList<HeaderCard>();
        if (comment.length() > 0) {
            headers.add(new HeaderCard(COMMENT, comment, false));
        }
        headers.add(fitsHdr.getHeaderCard(HeaderTag.SPOC_ID));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.SDPAREA));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.SDPDESC));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
        return headers;
    }
}
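The classes in this diff follow a template-method layout: the base class fixes the order of header blocks in `createFitsHeader()`, and each subclass overrides individual block builders such as `getIDInfo()`. A minimal, self-contained sketch of that pattern (hypothetical class and card names, not the Terrasaur or nom-tam-fits API):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the template-method structure used by AnciTableFits and its
// subclasses. Cards are plain strings here; the real code builds HeaderCard
// objects from a FitsHdr.
abstract class HeaderBuilder {
    // Fixed assembly order, mirroring createFitsHeader().
    final List<String> build() {
        List<String> cards = new ArrayList<>();
        cards.addAll(headerInfo());
        cards.addAll(idInfo());
        return cards;
    }

    // Default blocks; subclasses override only what differs.
    List<String> headerInfo() {
        return List.of("HDRVERS");
    }

    List<String> idInfo() {
        return List.of("MPHASE");
    }
}

class GlobalHeaderBuilder extends HeaderBuilder {
    // Override one block, as AltwgAnciGlobal.getIDInfo does for OREX-SPOC keywords.
    @Override
    List<String> idInfo() {
        return List.of("SPOC_ID", "SDPAREA", "SDPDESC", "MPHASE");
    }
}

public class TemplateDemo {
    public static void main(String[] args) {
        // Assembly order is preserved; only the ID block changed.
        System.out.println(new GlobalHeaderBuilder().build());
    }
}
```

Because `build()` is final, every product type emits its header blocks in the same order; a subclass can only change the contents of a block, not its position.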


@@ -30,67 +30,66 @@ import terrasaur.enums.FitsHeaderType;
public class AltwgAnciGlobalFacetRelation extends AnciTableFits implements AnciFitsHeader {

    public AltwgAnciGlobalFacetRelation(FitsHdr fitsHeader) {
        super(fitsHeader, FitsHeaderType.ANCIG_FACETRELATION_ALTWG);
    }

    // methods below override the concrete methods in AnciTableFits abstract class or
    // are specific to this class

    /**
     * Create fits header as a list of HeaderCard. List contains the keywords in the order of
     * appearance in the ALTWG fits header. Overrides default implementation in AnciTableFits.
     */
    @Override
    public List<HeaderCard> createFitsHeader() throws HeaderCardException {
        List<HeaderCard> headers = new ArrayList<HeaderCard>();
        headers.addAll(getHeaderInfo("header information"));
        headers.addAll(getMissionInfo("mission information"));
        headers.addAll(getIDInfo("identification info"));
        headers.addAll(getMapDataSrc("shape data source"));
        headers.addAll(getProcInfo("processing information"));
        headers.addAll(getMapInfo("map specific information"));
        headers.addAll(getSpatialInfo("summary spatial information"));
        headers.addAll(getSpecificInfo("product specific"));
        return headers;
    }

    /**
     * Return the HeaderCards associated with a specific product. By default we use the ALTWG specific
     * product keywords
     *
     * @param comment
     * @return
     * @throws HeaderCardException
     */
    @Override
    public List<HeaderCard> getSpecificInfo(String comment) throws HeaderCardException {
        List<HeaderCard> headers = new ArrayList<HeaderCard>();
        if (comment.length() > 0) {
            headers.add(new HeaderCard(COMMENT, comment, false));
        }
        headers.add(fitsHdr.getHeaderCard(HeaderTag.OBJINDX));
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSDINDX));
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSDINDXI));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.SIGMA));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.SIG_DEF));
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.DQUAL_1));
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.DQUAL_2));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.DSIG_DEF));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.DENSITY));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.ROT_RATE));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.REF_POT));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.TILT_MAJ));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.TILT_MIN));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.TILT_PA));
        return headers;
    }
}


@@ -30,53 +30,49 @@ import terrasaur.enums.FitsHeaderType;
public class AltwgAnciLocal extends AnciTableFits implements AnciFitsHeader {

    public AltwgAnciLocal(FitsHdr fitsHeader) {
        super(fitsHeader, FitsHeaderType.ANCILOCALALTWG);
    }

    // methods below override the concrete methods in AnciTableFits abstract class or
    // are specific to this class

    /**
     * Create fits header as a list of HeaderCard. List contains the keywords in the order of
     * appearance in the ALTWG fits header. Overrides default implementation in AnciTableFits.
     */
    @Override
    public List<HeaderCard> createFitsHeader() throws HeaderCardException {
        List<HeaderCard> headers = new ArrayList<HeaderCard>();
        headers.addAll(getHeaderInfo("header information"));
        headers.addAll(getMissionInfo("mission information"));
        headers.addAll(getIDInfo("identification info"));
        headers.addAll(getMapDataSrc("shape data source"));
        headers.addAll(getProcInfo("processing information"));
        headers.addAll(getMapInfo("map specific information"));
        headers.addAll(getSpatialInfo("summary spatial information"));
        headers.addAll(getSpecificInfo("product specific"));
        return headers;
    }

    /** Contains OREX-SPOC specific keywords. */
    @Override
    public List<HeaderCard> getIDInfo(String comment) throws HeaderCardException {
        List<HeaderCard> headers = new ArrayList<HeaderCard>();
        if (comment.length() > 0) {
            headers.add(new HeaderCard(COMMENT, comment, false));
        }
        headers.add(fitsHdr.getHeaderCard(HeaderTag.SPOC_ID));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.SDPAREA));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.SDPDESC));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
        return headers;
    }
}


@@ -33,77 +33,73 @@ import terrasaur.utils.DTMHeader;
 * Contains methods for building fits header corresponding to ALTWG Global DTM. Methods that are
 * specific to the ALTWG Global DTM fits header are contained here. Default methods are contained in
 * DTMFits class.
 *
 * @author espirrc1
 */
public class AltwgGlobalDTM extends DTMFits implements DTMHeader {

    public AltwgGlobalDTM(FitsHdr fitsHeader) {
        super(fitsHeader, FitsHeaderType.DTMGLOBALALTWG);
    }

    /**
     * Fits header block containing observation or ID related information. Includes keywords specific
     * to OREX-SPOC
     *
     * @return
     * @throws HeaderCardException
     */
    @Override
    public List<HeaderCard> getIDInfo(String comment) throws HeaderCardException {
        List<HeaderCard> headers = new ArrayList<HeaderCard>();
        if (comment.length() > 0) {
            headers.add(new HeaderCard(COMMENT, comment, false));
        }
        headers.add(fitsHdr.getHeaderCard(HeaderTag.SPOC_ID));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.SDPAREA));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.SDPDESC));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
        return headers;
    }

    /**
     * return Fits header block that contains information about the fits header itself. Custom to
     * OREX-SPOC
     *
     * @return
     * @throws HeaderCardException
     */
    @Override
    public List<HeaderCard> getHeaderInfo(String comment) throws HeaderCardException {
        List<HeaderCard> headers = new ArrayList<HeaderCard>();
        if (comment.length() > 0) {
            headers.add(new HeaderCard(COMMENT, comment, false));
        }
        headers.add(fitsHdr.getHeaderCard(HeaderTag.HDRVERS));
        return headers;
    }

    /** Added GSDI - specific to OREX-SPOC. */
    @Override
    public List<HeaderCard> getMapInfo(String comment) throws HeaderCardException {
        List<HeaderCard> headers = new ArrayList<HeaderCard>();
        if (comment.length() > 0) {
            headers.add(new HeaderCard(COMMENT, comment, false));
        }
        headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_NAME));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_VER));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_TYPE));
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSD));
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSDI));
        return headers;
    }
}


@@ -33,122 +33,119 @@ import terrasaur.utils.DTMHeader;
 * Contains methods for building fits header corresponding to ALTWG local DTM. Methods that are
 * specific to the ALTWG Local DTM fits header are contained here. Default methods are contained in
 * DTMFits class.
 *
 * @author espirrc1
 */
public class AltwgLocalDTM extends DTMFits implements DTMHeader {

    public AltwgLocalDTM(FitsHdr fitsHeader) {
        super(fitsHeader, FitsHeaderType.DTMLOCALALTWG);
    }

    /**
     * Fits header block containing observation or ID related information. Includes keywords specific
     * to OREX-SPOC
     *
     * @return
     * @throws HeaderCardException
     */
    @Override
    public List<HeaderCard> getIDInfo(String comment) throws HeaderCardException {
        List<HeaderCard> headers = new ArrayList<HeaderCard>();
        if (comment.length() > 0) {
            headers.add(new HeaderCard(COMMENT, comment, false));
        }
        headers.add(fitsHdr.getHeaderCard(HeaderTag.SPOC_ID));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.SDPAREA));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.SDPDESC));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
        return headers;
    }

    @Override
    public List<HeaderCard> getSpecificInfo(String comment) throws HeaderCardException {
        List<HeaderCard> headers = new ArrayList<HeaderCard>();
        if (comment.length() > 0) {
            headers.add(new HeaderCard(COMMENT, comment, false));
        }
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.SIGMA));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.SIG_DEF));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.DQUAL_1));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.DQUAL_2));
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.PXPERDEG));
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.DENSITY));
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.ROT_RATE));
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.REF_POT));
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.TILT_MAJ));
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.TILT_MIN));
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.TILT_PA));
        return headers;
    }

    /** Include corner points, center vector and ux,uy,uz describing local plane */
    @Override
    public List<HeaderCard> getSpatialInfo(String comment) throws HeaderCardException {
        List<HeaderCard> headers = new ArrayList<HeaderCard>();
        if (comment.length() > 0) {
            headers.add(new HeaderCard(COMMENT, comment, false));
        }
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLON));
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLAT));
        headers.addAll(getCornerCards());
        headers.addAll(getCenterVec());
        headers.addAll(getUX());
        headers.addAll(getUY());
        headers.addAll(getUZ());
        return headers;
    }

    /**
     * return Fits header block that contains information about the fits header itself. Custom to
     * OREX-SPOC
     *
     * @return
     * @throws HeaderCardException
     */
    @Override
    public List<HeaderCard> getHeaderInfo(String comment) throws HeaderCardException {
        List<HeaderCard> headers = new ArrayList<HeaderCard>();
        if (comment.length() > 0) {
            headers.add(new HeaderCard(COMMENT, comment, false));
        }
        headers.add(fitsHdr.getHeaderCard(HeaderTag.HDRVERS));
        return headers;
    }

    /** Added GSDI - specific to OREX-SPOC. */
    @Override
    public List<HeaderCard> getMapInfo(String comment) throws HeaderCardException {
        List<HeaderCard> headers = new ArrayList<HeaderCard>();
        if (comment.length() > 0) {
            headers.add(new HeaderCard(COMMENT, comment, false));
        }
        headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_NAME));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_VER));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_TYPE));
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSD));
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSDI));
        return headers;
    }
}


@@ -23,12 +23,10 @@
package terrasaur.fits;

import java.util.List;
import nom.tam.fits.HeaderCard;
import nom.tam.fits.HeaderCardException;

public interface AnciFitsHeader {

    public List<HeaderCard> createFitsHeader() throws HeaderCardException;
}


@@ -32,282 +32,272 @@ import terrasaur.enums.FitsHeaderType;
* Abstract generic class with concrete methods and attributes for creating a FITS table with
* generalized fits header. Specific implementations can be written to create custom fits headers as
* needed.
*
*
* @author espirrc1
*
*/
public abstract class AnciTableFits {
public final String COMMENT = "COMMENT";
FitsHdr fitsHdr;
public final FitsHeaderType fitsHeaderType;
public final String COMMENT = "COMMENT";
FitsHdr fitsHdr;
public final FitsHeaderType fitsHeaderType;
public AnciTableFits(FitsHdr fitsHdr, FitsHeaderType fitsHeaderType) {
this.fitsHdr = fitsHdr;
this.fitsHeaderType = fitsHeaderType;
}
public List<HeaderCard> createFitsHeader() throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
headers.addAll(getHeaderInfo("header information"));
headers.addAll(getMissionInfo("mission information"));
headers.addAll(getIDInfo("identification info"));
headers.addAll(getMapDataSrc("shape data source"));
headers.addAll(getProcInfo("processing information"));
headers.addAll(getMapInfo("map specific information"));
headers.addAll(getSpatialInfo("summary spatial information"));
return headers;
}
/**
* return Fits header block that contains information about the fits header itself.
*
* @return
* @throws HeaderCardException
*/
public List<HeaderCard> getHeaderInfo(String comment) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
if (comment.length() > 0) {
headers.add(new HeaderCard(COMMENT, comment, false));
}
headers.add(fitsHdr.getHeaderCard(HeaderTag.HDRVERS));
return headers;
}
/**
* Fits header block containing information about the mission.
*
* @return
* @throws HeaderCardException
*/
public List<HeaderCard> getMissionInfo(String comment) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
if (comment.length() > 0) {
headers.add(new HeaderCard(COMMENT, comment, false));
}
headers.add(fitsHdr.getHeaderCard(HeaderTag.MISSION));
headers.add(fitsHdr.getHeaderCard(HeaderTag.HOSTNAME));
headers.add(fitsHdr.getHeaderCard(HeaderTag.TARGET));
headers.add(fitsHdr.getHeaderCard(HeaderTag.ORIGIN));
return headers;
}
/**
* Fits header block containing observation or ID related information.
*
* @return
* @throws HeaderCardException
*/
public List<HeaderCard> getIDInfo(String comment) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
if (comment.length() > 0) {
headers.add(new HeaderCard(COMMENT, comment, false));
}
headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
return headers;
}
/**
* Fits header block containing information about the source data used to create the map.
*
* @param comment
* @return
* @throws HeaderCardException
*/
public List<HeaderCard> getMapDataSrc(String comment) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
if (comment.length() > 0) {
headers.add(new HeaderCard(COMMENT, comment, false));
}
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRC));
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCF));
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCV));
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCD));
headers.add(fitsHdr.getHeaderCard(HeaderTag.CREATOR));
headers.add(fitsHdr.getHeaderCard(HeaderTag.OBJ_FILE));
return headers;
}
public List<HeaderCard> getMapInfo(String comment) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
if (comment.length() > 0) {
headers.add(new HeaderCard(COMMENT, comment, false));
}
headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_NAME));
headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_VER));
headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_TYPE));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSD));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSDI));
return headers;
}
/**
* Fits header block containing information about the software processing done to generate the
* product.
*
* @param comment
* @return
* @throws HeaderCardException
*/
public List<HeaderCard> getProcInfo(String comment) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
if (comment.length() > 0) {
headers.add(new HeaderCard(COMMENT, comment, false));
}
headers.add(fitsHdr.getHeaderCard(HeaderTag.PRODNAME));
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATEPRD));
headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFTWARE));
headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFT_VER));
return headers;
}
public List<HeaderCard> getSpatialInfo(String comment) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
if (comment.length() > 0) {
headers.add(new HeaderCard(COMMENT, comment, false));
public AnciTableFits(FitsHdr fitsHdr, FitsHeaderType fitsHeaderType) {
this.fitsHdr = fitsHdr;
this.fitsHeaderType = fitsHeaderType;
}
headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLON));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLAT));
public List<HeaderCard> createFitsHeader() throws HeaderCardException {
headers.addAll(getCornerCards());
List<HeaderCard> headers = new ArrayList<HeaderCard>();
// add CNTR_V_X,Y,Z
headers.addAll(getCenterVec());
headers.addAll(getHeaderInfo("header information"));
headers.addAll(getMissionInfo("mission information"));
headers.addAll(getIDInfo("identification info"));
headers.addAll(getMapDataSrc("shape data source"));
headers.addAll(getProcInfo("processing information"));
headers.addAll(getMapInfo("map specific information"));
headers.addAll(getSpatialInfo("summary spatial information"));
// add UX_X,Y,Z
headers.addAll(getUX());
// add UY_X,Y,Z
headers.addAll(getUY());
// add UZ_X,Y,Z
headers.addAll(getUZ());
return headers;
}
/**
* Return the HeaderCards associated with a specific product. By default we use the ALTWG specific
* product keywords
*
* @param comment
* @return
* @throws HeaderCardException
*/
public List<HeaderCard> getSpecificInfo(String comment) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
if (comment.length() > 0) {
headers.add(new HeaderCard(COMMENT, comment, false));
return headers;
}
headers.add(fitsHdr.getHeaderCard(HeaderTag.SIGMA));
headers.add(fitsHdr.getHeaderCard(HeaderTag.SIG_DEF));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.DQUAL_1));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.DQUAL_2));
headers.add(fitsHdr.getHeaderCard(HeaderTag.DSIG_DEF));
headers.add(fitsHdr.getHeaderCard(HeaderTag.DENSITY));
headers.add(fitsHdr.getHeaderCard(HeaderTag.ROT_RATE));
headers.add(fitsHdr.getHeaderCard(HeaderTag.REF_POT));
headers.add(fitsHdr.getHeaderCard(HeaderTag.TILT_MAJ));
headers.add(fitsHdr.getHeaderCard(HeaderTag.TILT_MIN));
headers.add(fitsHdr.getHeaderCard(HeaderTag.TILT_PA));
/**
* return Fits header block that contains information about the fits header itself.
*
* @return
* @throws HeaderCardException
*/
public List<HeaderCard> getHeaderInfo(String comment) throws HeaderCardException {
return headers;
}
List<HeaderCard> headers = new ArrayList<HeaderCard>();
if (comment.length() > 0) {
headers.add(new HeaderCard(COMMENT, comment, false));
}
headers.add(fitsHdr.getHeaderCard(HeaderTag.HDRVERS));
/**
* Return headercards associated with the upper/lower left/right corners of the image.
*
* @return
* @throws HeaderCardException
*/
public List<HeaderCard> getCornerCards() throws HeaderCardException {
return headers;
}
List<HeaderCard> headers = new ArrayList<HeaderCard>();
String fmtS = "%18.13f";
headers.add(fitsHdr.getHeaderCard(HeaderTag.LLCLNG, fmtS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.LLCLAT, fmtS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.URCLNG, fmtS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.URCLAT, fmtS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.LRCLNG, fmtS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.LRCLAT, fmtS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.ULCLNG, fmtS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.ULCLAT, fmtS));
/**
* Fits header block containing information about the mission.
*
* @return
* @throws HeaderCardException
*/
public List<HeaderCard> getMissionInfo(String comment) throws HeaderCardException {
return headers;
}
List<HeaderCard> headers = new ArrayList<HeaderCard>();
if (comment.length() > 0) {
headers.add(new HeaderCard(COMMENT, comment, false));
}
headers.add(fitsHdr.getHeaderCard(HeaderTag.MISSION));
headers.add(fitsHdr.getHeaderCard(HeaderTag.HOSTNAME));
headers.add(fitsHdr.getHeaderCard(HeaderTag.TARGET));
headers.add(fitsHdr.getHeaderCard(HeaderTag.ORIGIN));
/**
* Return headercards for vector to center of image.
*
* @return
* @throws HeaderCardException
*/
public List<HeaderCard> getCenterVec() throws HeaderCardException {
return headers;
}
List<HeaderCard> headers = new ArrayList<HeaderCard>();
headers.add(fitsHdr.getHeaderCardD(HeaderTag.CNTR_V_X));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.CNTR_V_Y));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.CNTR_V_Z));
/**
* Fits header block containing observation or ID related information.
*
* @return
* @throws HeaderCardException
*/
public List<HeaderCard> getIDInfo(String comment) throws HeaderCardException {
return headers;
}
List<HeaderCard> headers = new ArrayList<HeaderCard>();
if (comment.length() > 0) {
headers.add(new HeaderCard(COMMENT, comment, false));
}
headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
public List<HeaderCard> getUX() throws HeaderCardException {
return headers;
}
List<HeaderCard> headers = new ArrayList<HeaderCard>();
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UX_X));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UX_Y));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UX_Z));
  /**
   * Fits header block containing information about the source data used to create the map.
   *
   * @param comment
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getMapDataSrc(String comment) throws HeaderCardException {
    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRC));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCF));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCV));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCS));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCD));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.CREATOR));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.OBJ_FILE));
    return headers;
  }
  public List<HeaderCard> getMapInfo(String comment) throws HeaderCardException {
    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_NAME));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_VER));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_TYPE));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSD));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSDI));
    return headers;
  }
  /**
   * Fits header block containing information about the software processing done to generate the
   * product.
   *
   * @param comment
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getProcInfo(String comment) throws HeaderCardException {
    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.PRODNAME));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATEPRD));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFTWARE));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFT_VER));
    return headers;
  }
  public List<HeaderCard> getSpatialInfo(String comment) throws HeaderCardException {
    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLON));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLAT));
    headers.addAll(getCornerCards());
    // add CNTR_V_X,Y,Z
    headers.addAll(getCenterVec());
    // add UX_X,Y,Z
    headers.addAll(getUX());
    // add UY_X,Y,Z
    headers.addAll(getUY());
    // add UZ_X,Y,Z
    headers.addAll(getUZ());
    return headers;
  }
/**
* Return the HeaderCards associated with a specific product. By default we use the ALTWG specific
* product keywords
*
* @param comment
* @return
* @throws HeaderCardException
*/
public List<HeaderCard> getSpecificInfo(String comment) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
if (comment.length() > 0) {
headers.add(new HeaderCard(COMMENT, comment, false));
}
headers.add(fitsHdr.getHeaderCard(HeaderTag.SIGMA));
headers.add(fitsHdr.getHeaderCard(HeaderTag.SIG_DEF));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.DQUAL_1));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.DQUAL_2));
headers.add(fitsHdr.getHeaderCard(HeaderTag.DSIG_DEF));
headers.add(fitsHdr.getHeaderCard(HeaderTag.DENSITY));
headers.add(fitsHdr.getHeaderCard(HeaderTag.ROT_RATE));
headers.add(fitsHdr.getHeaderCard(HeaderTag.REF_POT));
headers.add(fitsHdr.getHeaderCard(HeaderTag.TILT_MAJ));
headers.add(fitsHdr.getHeaderCard(HeaderTag.TILT_MIN));
headers.add(fitsHdr.getHeaderCard(HeaderTag.TILT_PA));
return headers;
}
/**
* Return headercards associated with the upper/lower left/right corners of the image.
*
* @return
* @throws HeaderCardException
*/
public List<HeaderCard> getCornerCards() throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
String fmtS = "%18.13f";
headers.add(fitsHdr.getHeaderCard(HeaderTag.LLCLNG, fmtS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.LLCLAT, fmtS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.URCLNG, fmtS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.URCLAT, fmtS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.LRCLNG, fmtS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.LRCLAT, fmtS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.ULCLNG, fmtS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.ULCLAT, fmtS));
return headers;
}
/**
* Return headercards for vector to center of image.
*
* @return
* @throws HeaderCardException
*/
public List<HeaderCard> getCenterVec() throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
headers.add(fitsHdr.getHeaderCardD(HeaderTag.CNTR_V_X));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.CNTR_V_Y));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.CNTR_V_Z));
return headers;
}
public List<HeaderCard> getUX() throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UX_X));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UX_Y));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UX_Z));
return headers;
}
public List<HeaderCard> getUY() throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UY_X));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UY_Y));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UY_Z));
return headers;
}
public List<HeaderCard> getUZ() throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UZ_X));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UZ_Y));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UZ_Z));
return headers;
}
}
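Every header-block method above follows the same pattern: optionally prepend a `COMMENT` separator card, then append the block's keyword cards. A minimal, self-contained sketch of that pattern, using plain strings as a stand-in for the nom.tam.fits `HeaderCard` objects the real code builds (the stand-in type and method names here are assumptions for illustration):

```java
import java.util.ArrayList;
import java.util.List;

public class HeaderBlockSketch {
    // Mirrors getMapDataSrc()/getProcInfo() etc.: a COMMENT separator card
    // is emitted only when a non-empty section comment is supplied.
    static List<String> block(String comment, List<String> cards) {
        List<String> headers = new ArrayList<>();
        if (!comment.isEmpty()) {
            headers.add("COMMENT " + comment); // section separator card
        }
        headers.addAll(cards);
        return headers;
    }

    public static void main(String[] args) {
        System.out.println(block("data source",
                List.of("DATASRC = 'SPC'", "OBJ_FILE= 'shape.obj'")));
    }
}
```

The separator convention keeps each logical block of keywords visually grouped when the finished FITS header is inspected.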


@@ -32,342 +32,332 @@ import terrasaur.enums.FitsHeaderType;
* Abstract generic class with concrete methods and attributes for creating a FITS DTM cube with a
* generalized fits header. Specific implementations can be written to create custom fits headers as
* needed.
 *
* @author espirrc1
*
*/
public abstract class DTMFits {
  public final String COMMENT = "COMMENT";
  final FitsHdr fitsHdr;
  private FitsData fitsData;
  private boolean dataContained = false;
  public final FitsHeaderType fitsHeaderType;
  public DTMFits(FitsHdr fitsHdr, FitsHeaderType fitsHeaderType) {
    this.fitsHdr = fitsHdr;
    this.fitsHeaderType = fitsHeaderType;
  }
  public void setData(FitsData fitsData) {
    this.fitsData = fitsData;
    dataContained = true;
  }
  public List<HeaderCard> createFitsHeader(List<HeaderCard> planeList) throws HeaderCardException {
    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    headers.addAll(getHeaderInfo("header information"));
    headers.addAll(getMissionInfo("mission information"));
    headers.addAll(getIDInfo("identification info"));
    headers.addAll(getMapDataSrc("data source"));
    headers.addAll(getProcInfo("processing information"));
    headers.addAll(getMapInfo("map specific information"));
    headers.addAll(getSpatialInfo("summary spatial information"));
    headers.addAll(getPlaneInfo("plane information", planeList));
    headers.addAll(getSpecificInfo("product specific"));
    // end keyword
    headers.add(getEnd());
    return headers;
  }
  /**
   * return Fits header block that contains information about the fits header itself. No string
   * passed, so no comment in header.
   *
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getHeaderInfo() throws HeaderCardException {
    return getHeaderInfo("");
  }
  /**
   * return Fits header block that contains information about the fits header itself. This is a
   * custom section and so is left empty here. It can be defined in the concrete classes that extend
   * this class.
   *
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getHeaderInfo(String comment) throws HeaderCardException {
    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    return headers;
  }
  /**
   * Fits header block containing information about the mission.
   *
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getMissionInfo(String comment) throws HeaderCardException {
    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MISSION));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.HOSTNAME));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.TARGET));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.ORIGIN));
    return headers;
  }
  /**
   * Fits header block containing observation or ID related information.
   *
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getIDInfo(String comment) throws HeaderCardException {
    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
    return headers;
  }
  /**
   * Fits header block containing information about the source data used to create the map.
   *
   * @param comment
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getMapDataSrc(String comment) throws HeaderCardException {
    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRC));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCF));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCV));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCS));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCD));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.OBJ_FILE));
    return headers;
  }
  public List<HeaderCard> getMapInfo(String comment) throws HeaderCardException {
    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_NAME));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_VER));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_TYPE));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSD));
    return headers;
  }
  /**
   * Fits header block containing information about the software processing done to generate the
   * product.
   *
   * @param comment
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getProcInfo(String comment) throws HeaderCardException {
    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCard(HeaderTag.PRODNAME));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.DATEPRD));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFTWARE));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFT_VER));
    return headers;
  }
  /**
   * Creates header block containing spatial information for the DTM, e.g. corner locations, vector
   * to center, Ux, Uy, Uz.
   *
   * @param comment
   * @return
   * @throws HeaderCardException
   */
  public List<HeaderCard> getSpatialInfo(String comment) throws HeaderCardException {
    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
      headers.add(new HeaderCard(COMMENT, comment, false));
    }
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLON));
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLAT));
    headers.addAll(getCornerCards());
    // remove these keywords. They are specific to local and MLNs
    // headers.addAll(getCenterVec());
    // headers.addAll(getUX());
    // headers.addAll(getUY());
    // headers.addAll(getUZ());
    return headers;
  }
/**
* Return the HeaderCards describing each DTM plane. Used to build the portion of the fits header
* that contains information about the planes in the DTM cube. Checks to see that all data planes
* are described by comparing size of planeList against length of fits data.
*
* @param comment
* @param planeList
* @return
* @throws HeaderCardException
*/
public List<HeaderCard> getPlaneInfo(String comment, List<HeaderCard> planeList) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
if (comment.length() > 0) {
headers.add(new HeaderCard(COMMENT, comment, false));
}
headers.addAll(planeList);
if (!dataContained) {
String errMesg =
"ERROR! Cannot return keywords describing the DTM cube without " + "having the actual data!";
throw new RuntimeException(errMesg);
}
// check if planeList describes all the planes in data, throw runtime exception if not.
if (planeList.size() != fitsData.getData().length) {
System.out.println("Error: plane List has " + planeList.size() + " planes but datacube has "
+ fitsData.getData().length + " planes");
for (HeaderCard thisPlane : planeList) {
System.out.println(thisPlane.getKey() + ":" + thisPlane.getValue());
}
String errMesg = "Error: plane List has " + planeList.size() + " planes but datacube " + " has "
+ fitsData.getData().length + " planes";
throw new RuntimeException(errMesg);
}
return headers;
}
/**
* Return the HeaderCards associated with a specific product.
*
* @param comment
* @return
* @throws HeaderCardException
*/
public List<HeaderCard> getSpecificInfo(String comment) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
if (comment.length() > 0) {
headers.add(new HeaderCard(COMMENT, comment, false));
}
headers.add(fitsHdr.getHeaderCardD(HeaderTag.SIGMA));
headers.add(fitsHdr.getHeaderCard(HeaderTag.SIG_DEF));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.PXPERDEG));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.DENSITY));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.ROT_RATE));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.REF_POT));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.TILT_MAJ));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.TILT_MIN));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.TILT_PA));
return headers;
}
/**
* Return headercards associated with the upper/lower left/right corners of the image.
*
* @return
* @throws HeaderCardException
*/
public List<HeaderCard> getCornerCards() throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
String fmtS = "%18.13f";
headers.add(fitsHdr.getHeaderCard(HeaderTag.LLCLNG, fmtS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.LLCLAT, fmtS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.URCLNG, fmtS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.URCLAT, fmtS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.LRCLNG, fmtS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.LRCLAT, fmtS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.ULCLNG, fmtS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.ULCLAT, fmtS));
return headers;
}
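The corner cards above are written with the fixed-width format string `"%18.13f"`: a total field width of 18 characters with 13 digits after the decimal point, which keeps the latitude/longitude values column-aligned in the header. A quick standalone illustration (using `Locale.ROOT` here only to make the decimal separator deterministic, which is an addition to the original code):

```java
import java.util.Locale;

public class CornerFormatSketch {
    public static void main(String[] args) {
        // Same format string used for the corner lat/lon cards:
        // width 18, 13 digits after the decimal point.
        String fmtS = "%18.13f";
        System.out.println("[" + String.format(Locale.ROOT, fmtS, -12.5) + "]");
        // -> [ -12.5000000000000]
    }
}
```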
/**
* Return headercards for vector to center of image.
*
* @return
* @throws HeaderCardException
*/
public List<HeaderCard> getCenterVec() throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
headers.add(fitsHdr.getHeaderCardD(HeaderTag.CNTR_V_X));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.CNTR_V_Y));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.CNTR_V_Z));
return headers;
}
public List<HeaderCard> getUX() throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UX_X));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UX_Y));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UX_Z));
return headers;
}
public List<HeaderCard> getUY() throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UY_X));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UY_Y));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UY_Z));
return headers;
}
public List<HeaderCard> getUZ() throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UZ_X));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UZ_Y));
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UZ_Z));
return headers;
}
public HeaderCard getEnd() throws HeaderCardException {
return new HeaderCard(HeaderTag.END.toString(), HeaderTag.END.value(), HeaderTag.END.comment());
}
}
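`getPlaneInfo` above refuses to emit plane keywords unless every plane of the data cube has a matching description card, throwing a `RuntimeException` on mismatch. The consistency check can be sketched in isolation like this (the class and method names below are hypothetical stand-ins, not part of Terrasaur):

```java
import java.util.List;

public class PlaneCheckSketch {
    // Mirrors the check in getPlaneInfo(): one description card per data plane,
    // comparing the card list size against the cube's first dimension.
    static void checkPlanes(List<String> planeList, double[][][] cube) {
        if (planeList.size() != cube.length) {
            throw new RuntimeException("Error: plane List has " + planeList.size()
                    + " planes but datacube has " + cube.length + " planes");
        }
    }

    public static void main(String[] args) {
        double[][][] cube = new double[2][1][1];
        checkPlanes(List.of("PLANE1", "PLANE2"), cube); // consistent: no exception
        try {
            checkPlanes(List.of("PLANE1"), cube);       // mismatch: throws
        } catch (RuntimeException expected) {
            System.out.println(expected.getMessage());
        }
    }
}
```

Failing fast here prevents writing a FITS file whose header describes a different number of planes than the cube actually contains.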


@@ -27,186 +27,184 @@ import terrasaur.enums.SrcProductType;
public class FitsData {
  private final double[][][] data;
  private final double[] V;
  private final double[] ux;
  private final double[] uy;
  private final double[] uz;
  private final double scale;
  private final double gsd;
  private final boolean hasV;
  private final boolean hasUnitv;
  private final boolean hasGsd;
  private final boolean isGlobal;
  private final boolean hasAltType;
  private final AltwgDataType altProd;
  private final String dataSource;
  private FitsData(FitsDataBuilder b) {
    this.data = b.data;
    this.V = b.V;
    this.ux = b.ux;
    this.uy = b.uy;
    this.uz = b.uz;
    this.scale = b.scale;
    this.gsd = b.gsd;
    this.hasV = b.hasV;
    this.hasUnitv = b.hasUnitv;
    this.hasGsd = b.hasGsd;
    this.hasAltType = b.hasAltType;
    this.isGlobal = b.isGlobal;
    this.altProd = b.altProd;
    this.dataSource = b.dataSource;
  }
  public AltwgDataType getAltProdType() {
    return this.altProd;
  }
  public String getSrcProdType() {
    return this.dataSource;
  }
  public double[][][] getData() {
    return this.data;
  }
  public boolean hasV() {
    return this.hasV;
  }
  public double[] getV() {
    return this.V;
  }
  public boolean hasUnitv() {
    return this.hasUnitv;
  }
  public double[] getUnit(UnitDir udir) {
    switch (udir) {
      case UX:
        return this.ux;
      case UY:
        return this.uy;
      case UZ:
        return this.uz;
      default:
        throw new RuntimeException();
    }
  }
public double getScale() {
return this.scale;
}
public boolean hasGsd() {
return this.hasGsd;
}
public boolean hasAltType() {
return this.hasAltType;
}
public boolean isGlobal() {
return this.isGlobal;
}
public double getGSD() {
if (this.hasGsd) {
return this.gsd;
} else {
String errMesg = "ERROR! fitsData does not have gsd!";
throw new RuntimeException(errMesg);
}
}
public static class FitsDataBuilder {
private final double[][][] data;
private double[] V = null;
private double[] ux = null;
private double[] uy = null;
private double[] uz = null;
private boolean hasV = false;
private boolean hasUnitv = false;
private boolean hasGsd = false;
private boolean isGlobal = false;
private boolean hasAltType = false;
private double scale = Double.NaN;
private double gsd = Double.NaN;
private AltwgDataType altProd = null;
private String dataSource = SrcProductType.UNKNOWN.toString();
/**
* Constructor. isGlobal used to fill out fits keyword describing whether data is local or
* global. May also be used for fits naming convention.
*
* @param data
* @param isGlobal
*/
public FitsDataBuilder(double[][][] data, boolean isGlobal) {
this.data = data;
this.isGlobal = isGlobal;
}
public FitsDataBuilder setAltProdType(AltwgDataType altProd) {
this.altProd = altProd;
this.hasAltType = true;
return this;
}
public FitsDataBuilder setDataSource(String dataSource) {
this.dataSource = dataSource;
return this;
}
public FitsDataBuilder setV(double[] V) {
this.V = V;
this.hasV = true;
return this;
}
public FitsDataBuilder setU(double[] uvec, UnitDir udir) {
switch (udir) {
case UX:
this.ux = uvec;
this.hasUnitv = true;
break;
case UY:
this.uy = uvec;
this.hasUnitv = true;
break;
case UZ:
this.uz = uvec;
this.hasUnitv = true;
break;
default:
throw new RuntimeException();
}
return this;
}
public FitsDataBuilder setScale(double scale) {
this.scale = scale;
return this;
}
public FitsDataBuilder setGSD(double gsd) {
this.gsd = gsd;
this.hasGsd = true;
return this;
}
public FitsData build() {
return new FitsData(this);
}
}
}
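`FitsData` is built through `FitsDataBuilder`: optional fields (vector, unit vectors, GSD, ...) are recorded together with a `has*` flag, and accessors on the immutable product throw if an unset field is read. A minimal, self-contained mirror of that pattern with a single optional field (the `Product`/`Builder` names are hypothetical, not Terrasaur classes):

```java
public class BuilderSketch {
    // Immutable product: the optional field travels with its has* flag,
    // exactly as gsd/hasGsd do in FitsData.
    static final class Product {
        final double gsd;
        final boolean hasGsd;
        private Product(Builder b) {
            this.gsd = b.gsd;
            this.hasGsd = b.hasGsd;
        }
        double getGSD() {
            if (!hasGsd) {
                throw new RuntimeException("ERROR! fitsData does not have gsd!");
            }
            return gsd;
        }
    }

    static final class Builder {
        private double gsd = Double.NaN;
        private boolean hasGsd = false;
        Builder setGSD(double gsd) {
            this.gsd = gsd;
            this.hasGsd = true;
            return this;
        }
        Product build() {
            return new Product(this);
        }
    }

    public static void main(String[] args) {
        Product p = new Builder().setGSD(0.05).build();
        System.out.println(p.getGSD());
    }
}
```

Pairing each optional value with a flag lets the product distinguish "never set" from a legitimate `NaN`, at the cost of callers having to check `hasGsd()` before calling `getGSD()`.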

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -33,203 +33,193 @@ import terrasaur.utils.DTMHeader;
* Factory class that returns List<HeaderCard> where the HeaderCard objects are in the correct order
* for writing to a fits header. Also contains methods for creating List<HeaderCard> for different
* sections of a fits header
 *
* @author espirrc1
*
*/
public class FitsHeaderFactory {
  private static final String PLANE = "PLANE";
  private static final String COMMENT = "COMMENT";
  public static DTMHeader getDTMHeader(FitsHdr fitsHdr, FitsHeaderType headerType) {
    switch (headerType) {
      case NFTMLN:
        return new NFTmln(fitsHdr);
      case DTMLOCALALTWG:
        return new AltwgLocalDTM(fitsHdr);
      case DTMGLOBALALTWG:
        return new AltwgGlobalDTM(fitsHdr);
      case DTMGLOBALGENERIC:
        return new GenericGlobalDTM(fitsHdr);
      case DTMLOCALGENERIC:
        return new GenericLocalDTM(fitsHdr);
      default:
        return null;
    }
  }
  public static AnciFitsHeader getAnciHeader(FitsHdr fitsHdr, FitsHeaderType headerType) {
    switch (headerType) {
      case ANCIGLOBALGENERIC:
        return new GenericAnciGlobal(fitsHdr);
      case ANCILOCALGENERIC:
        return new GenericAnciLocal(fitsHdr);
      case ANCIGLOBALALTWG:
        return new AltwgAnciGlobal(fitsHdr);
      case ANCIG_FACETRELATION_ALTWG:
        return new AltwgAnciGlobalFacetRelation(fitsHdr);
      case ANCILOCALALTWG:
        return new AltwgAnciLocal(fitsHdr);
      default:
        return null;
    }
  }
  /**
   * Fits Header block that contains information about the fits header itself. Ex. Header version
   * number.
   *
   * @param fitsHdr
   * @return
   * @throws HeaderCardException
   */
  public static List<HeaderCard> getHeaderInfo(FitsHeader fitsHdr) throws HeaderCardException {
    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    headers.add(new HeaderCard(COMMENT, "header information", false));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.HDRVERS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.HDRVERS));
return headers;
}
/**
* Fits header block that contains information about the mission Ex. MISSION name, HOST name,
* Target name.
*
* @param fitsHdr
* @return
* @throws HeaderCardException
*/
public static List<HeaderCard> getMissionInfo(FitsHeader fitsHdr) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
headers.add(new HeaderCard(COMMENT, "mission information", false));
headers.add(fitsHdr.getHeaderCard(HeaderTag.MISSION));
headers.add(fitsHdr.getHeaderCard(HeaderTag.HOSTNAME));
headers.add(fitsHdr.getHeaderCard(HeaderTag.TARGET));
headers.add(fitsHdr.getHeaderCard(HeaderTag.ORIGIN));
return headers;
}
/**
* Fits header block that contains ID information, i.e. information that would uniquely identify
* the data product.
*
* @param fitsHdr
* @return
* @throws HeaderCardException
*/
public static List<HeaderCard> getIdInfo(FitsHeader fitsHdr) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
headers.add(new HeaderCard(COMMENT, "identification info", false));
// check latest Map Format SIS revision to see if SPOC handles these keywords
headers.add(fitsHdr.getHeaderCard(HeaderTag.SPOC_ID));
headers.add(fitsHdr.getHeaderCard(HeaderTag.SDPAREA));
headers.add(fitsHdr.getHeaderCard(HeaderTag.SDPDESC));
headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
return headers;
}
/**
* Fits header block that contains information about the source shape model used to create the
* fits file.
*
* @param fitsHdr
* @return
* @throws HeaderCardException
*/
public static List<HeaderCard> getShapeSourceInfo(FitsHeader fitsHdr) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
headers.add(new HeaderCard(COMMENT, "shape data source", false));
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRC));
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCF));
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCV));
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCD));
headers.add(fitsHdr.getHeaderCard(HeaderTag.OBJ_FILE));
return headers;
}
public static List<HeaderCard> getProcInfo(FitsHeader fitsHdr) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
headers.add(new HeaderCard(COMMENT, "processing information", false));
headers.add(fitsHdr.getHeaderCard(HeaderTag.PRODNAME));
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATEPRD));
headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFTWARE));
headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFT_VER));
return headers;
}
public static List<HeaderCard> getMapSpecificInfo(FitsHeader fitsHdr) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
headers.add(new HeaderCard(COMMENT, "map specific information", false));
headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_NAME));
headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_VER));
headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_TYPE));
return headers;
// check latest Map Format SIS revision to see if SPOC handles these keywords
// MAP_PROJ*
// GSD*
// GSDI*
}
public static List<HeaderCard> getSummarySpatialInfo(FitsHeader fitsHdr,
FitsHeaderType fitsHeaderType) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
// this section common to all fitsHeaderTypes
headers.add(new HeaderCard(COMMENT, "summary spatial information", false));
headers.add(fitsHdr.getHeaderCard(HeaderTag.CLON));
headers.add(fitsHdr.getHeaderCard(HeaderTag.CLAT));
switch (fitsHeaderType) {
case ANCILOCALGENERIC:
headers.add(fitsHdr.getHeaderCard(HeaderTag.LLCLNG));
headers.add(fitsHdr.getHeaderCard(HeaderTag.LLCLAT));
headers.add(fitsHdr.getHeaderCard(HeaderTag.URCLNG));
headers.add(fitsHdr.getHeaderCard(HeaderTag.URCLAT));
headers.add(fitsHdr.getHeaderCard(HeaderTag.LRCLNG));
headers.add(fitsHdr.getHeaderCard(HeaderTag.LRCLAT));
headers.add(fitsHdr.getHeaderCard(HeaderTag.ULCLNG));
headers.add(fitsHdr.getHeaderCard(HeaderTag.ULCLAT));
break;
default:
// default does nothing because switch only handles specific cases.
break;
return headers;
}
return headers;
}
/**
* Fits header block that contains information about the mission Ex. MISSION name, HOST name,
* Target name.
*
* @param fitsHdr
* @return
* @throws HeaderCardException
*/
public static List<HeaderCard> getMissionInfo(FitsHeader fitsHdr) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
headers.add(new HeaderCard(COMMENT, "mission information", false));
headers.add(fitsHdr.getHeaderCard(HeaderTag.MISSION));
headers.add(fitsHdr.getHeaderCard(HeaderTag.HOSTNAME));
headers.add(fitsHdr.getHeaderCard(HeaderTag.TARGET));
headers.add(fitsHdr.getHeaderCard(HeaderTag.ORIGIN));
return headers;
}
/**
* Fits header block that contains ID information, i.e. information that would uniquely identify
* the data product.
*
* @param fitsHdr
* @return
* @throws HeaderCardException
*/
public static List<HeaderCard> getIdInfo(FitsHeader fitsHdr) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
headers.add(new HeaderCard(COMMENT, "identification info", false));
// check latest Map Format SIS revision to see if SPOC handles these keywords
headers.add(fitsHdr.getHeaderCard(HeaderTag.SPOC_ID));
headers.add(fitsHdr.getHeaderCard(HeaderTag.SDPAREA));
headers.add(fitsHdr.getHeaderCard(HeaderTag.SDPDESC));
headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
return headers;
}
/**
* Fits header block that contains information about the source shape model used to create the
* fits file.
*
* @param fitsHdr
* @return
* @throws HeaderCardException
*/
public static List<HeaderCard> getShapeSourceInfo(FitsHeader fitsHdr) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
headers.add(new HeaderCard(COMMENT, "shape data source", false));
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRC));
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCF));
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCV));
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCD));
headers.add(fitsHdr.getHeaderCard(HeaderTag.OBJ_FILE));
return headers;
}
public static List<HeaderCard> getProcInfo(FitsHeader fitsHdr) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
headers.add(new HeaderCard(COMMENT, "processing information", false));
headers.add(fitsHdr.getHeaderCard(HeaderTag.PRODNAME));
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATEPRD));
headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFTWARE));
headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFT_VER));
return headers;
}
public static List<HeaderCard> getMapSpecificInfo(FitsHeader fitsHdr) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
headers.add(new HeaderCard(COMMENT, "map specific information", false));
headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_NAME));
headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_VER));
headers.add(fitsHdr.getHeaderCard(HeaderTag.MAP_TYPE));
return headers;
// check latest Map Format SIS revision to see if SPOC handles these keywords
// MAP_PROJ*
// GSD*
// GSDI*
}
public static List<HeaderCard> getSummarySpatialInfo(FitsHeader fitsHdr, FitsHeaderType fitsHeaderType)
throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
// this section common to all fitsHeaderTypes
headers.add(new HeaderCard(COMMENT, "summary spatial information", false));
headers.add(fitsHdr.getHeaderCard(HeaderTag.CLON));
headers.add(fitsHdr.getHeaderCard(HeaderTag.CLAT));
switch (fitsHeaderType) {
case ANCILOCALGENERIC:
headers.add(fitsHdr.getHeaderCard(HeaderTag.LLCLNG));
headers.add(fitsHdr.getHeaderCard(HeaderTag.LLCLAT));
headers.add(fitsHdr.getHeaderCard(HeaderTag.URCLNG));
headers.add(fitsHdr.getHeaderCard(HeaderTag.URCLAT));
headers.add(fitsHdr.getHeaderCard(HeaderTag.LRCLNG));
headers.add(fitsHdr.getHeaderCard(HeaderTag.LRCLAT));
headers.add(fitsHdr.getHeaderCard(HeaderTag.ULCLNG));
headers.add(fitsHdr.getHeaderCard(HeaderTag.ULCLAT));
break;
default:
// default does nothing because switch only handles specific cases.
break;
}
return headers;
}
}
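The two factory methods above dispatch on an enum and return a concrete header implementation, falling back to `null` for unhandled types. A minimal standalone sketch of that dispatch pattern, using placeholder enum constants and a placeholder `Header` interface rather than the real terrasaur classes:

```java
// Standalone sketch of the switch-based factory dispatch in FitsHeaderFactory.
// HeaderType, Header, and the returned strings are placeholders for illustration.
public class FactorySketch {

    enum HeaderType { DTMLOCAL, DTMGLOBAL, UNKNOWN }

    interface Header {
        String name();
    }

    static Header getHeader(HeaderType type) {
        switch (type) {
            case DTMLOCAL:
                return () -> "local";
            case DTMGLOBAL:
                return () -> "global";
            default:
                return null; // mirrors the factory's null return for unhandled types
        }
    }

    public static void main(String[] args) {
        System.out.println(getHeader(HeaderType.DTMGLOBAL).name());
    }
}
```

Note that callers of a null-returning factory must check the result; the real code relies on only passing supported `FitsHeaderType` values.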


@@ -24,38 +24,37 @@ package terrasaur.fits;
/**
 * Container class for storing the value and comment associated with a given fits keyword
 *
 * @author espirrc1
 */
public class FitsValCom {

    private String value;
    private String comment;

    public FitsValCom(String value, String comment) {
        this.value = value;
        this.comment = comment;
    }

    public String getV() {
        return value;
    }

    public String getC() {
        return comment;
    }

    public void setV(String newVal) {
        this.value = newVal;
    }

    public void setC(String newComment) {
        this.comment = newComment;
    }

    public void setVC(String newVal, String newComment) {
        setV(newVal);
        setC(newComment);
    }
}


@@ -31,38 +31,36 @@ import terrasaur.enums.FitsHeaderType;
/**
 * Contains methods for building fits header corresponding to the Generic Ancillary GLobal fits
 * header as specified in the Map Formats SIS.
 *
 * See concrete methods and attributes in AnciTableFits unless overridden. Overridden or
 * implementation methods specific to this fits type are here.
 *
 * @author espirrc1
 */
public class GenericAnciGlobal extends AnciTableFits implements AnciFitsHeader {

    public GenericAnciGlobal(FitsHdr fitsHeader) {
        super(fitsHeader, FitsHeaderType.ANCIGLOBALGENERIC);
    }

    /**
     * Build fits header portion that contains the spatial information of the Generic Anci Global
     * product. Overrides the default implementation in AnciTableFits.
     */
    @Override
    public List<HeaderCard> getSpatialInfo(String comment) throws HeaderCardException {
        List<HeaderCard> headers = new ArrayList<HeaderCard>();
        if (comment.length() > 0) {
            headers.add(new HeaderCard(COMMENT, comment, false));
        }
        // the generic Global Anci product ONLY CONTAINS THE CENTER LAT LON!
        // per map_format_fits_header_normalization_09212917_V02.xlsx
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLON));
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLAT));
        return headers;
    }
}


@@ -27,18 +27,16 @@ import terrasaur.enums.FitsHeaderType;
/**
 * Contains methods for building fits header corresponding to the Generic Ancillary Local fits
 * header as specified in the Map Formats SIS.
 *
 * See concrete methods and attributes in AnciTableFits unless overridden. Overridden or
 * implementation methods specific to this fits type are here.
 *
 * @author espirrc1
 */
public class GenericAnciLocal extends AnciTableFits implements AnciFitsHeader {

    public GenericAnciLocal(FitsHdr fitsHeader) {
        super(fitsHeader, FitsHeaderType.ANCILOCALGENERIC);
    }
}


@@ -28,14 +28,13 @@ import terrasaur.utils.DTMHeader;
/**
 * Contains methods for building generic Global DTM fits header. Generic Global DTM header will
 * contain only those keywords that are deemed common to all Global DTM fits files.
 *
 * @author espirrc1
 */
public class GenericGlobalDTM extends DTMFits implements DTMHeader {

    public GenericGlobalDTM(FitsHdr fitsHeader) {
        super(fitsHeader, FitsHeaderType.DTMGLOBALGENERIC);
    }
}


@@ -28,14 +28,13 @@ import terrasaur.utils.DTMHeader;
/**
 * Contains methods for building generic Local DTM Fits Header. DTM header will contain only those
 * keywords that are deemed common to all Local DTM fits files.
 *
 * @author espirrc1
 */
public class GenericLocalDTM extends DTMFits implements DTMHeader {

    public GenericLocalDTM(FitsHdr fitsHeader) {
        super(fitsHeader, FitsHeaderType.DTMLOCALGENERIC);
    }
}


@@ -31,153 +31,177 @@ import java.util.EnumSet;
 * enumeration. A few keywords may have overlap. For example, SIGMA is defined here to represent the
 * global uncertainty of the data. It is also in PlaneInfo to define an entire image plane
 * consisting of sigma values.
 *
 * @author espirrc1
 */
public enum HeaderTag {
    // fits data tags. Included here in case we want to have values or comments
    // that are not null. Some of the default values obviously need to be updated when actual fits
    // file is created.
    SIMPLE("T", null),
    BITPIX(null, null),
    NAXIS("3", null),
    NAXIS1(null, null),
    NAXIS2(null, null),
    NAXIS3(null, null),
    EXTEND("T", null),
    HDRVERS("1.0.0", null),
    NPRDVERS("1.0.0", null),
    MISSION("OSIRIS-REx", "Name of mission"),
    HOSTNAME("OREX", "PDS ID"),
    TARGET("101955 BENNU", "Target object"),
    ORIGIN("OREXSPOC", null),
    SPOC_ID("SPOCUPLOAD", null),
    SDPAREA("SPOCUPLOAD", null),
    SDPDESC("SPOCUPLOAD", null),
    MPHASE("FillMeIn", "Mission phase."),
    DATASRC("FillMeIn", "Shape model data source, i.e. 'SPC' or 'OLA'"),
    DATASRCF("FillMeIn", "Source shape model data filename"),
    DATASRCV("FillMeIn", "Name and version of shape model"),
    DATASRCD("FillMeIn", "Creation date of shape model in UTC"),
    DATASRCS("N/A", "[m/pix] Shpe model plt scale"),
    SOFTWARE("FillMeIn", "Software used to create map data"),
    SOFT_VER("FillMeIn", "Version of software used to create map data"),
    DATEPRD("1701-10-09", "Date this product was produced in UTC"),
    DATENPRD("1701-10-09", "Date this NFT product was produced in UTC"),
    PRODNAME("FillMeIn", "Product filename"),
    PRODVERS("1.0.0", "Product version number"),
    MAP_VER("999", "Product version number."),
    CREATOR("ALT-pipeline", "Name of software that created this product"),
    AUTHOR("Espiritu", "Name of person that compiled this product"),
    PROJECTN("Equirectangular", "Simple cylindrical projection"),
    CLON("-999", "[deg] longitude at center of image"),
    CLAT("-999", "[deg] latitude at center of image"),
    MINLON(null, "[deg] minimum longitude of global DTM"),
    MAXLON(null, "[deg] maximum longitude of global DTM"),
    MINLAT(null, "[deg] minimum latitude of global DTM"),
    MAXLAT(null, "[deg] maximum latitude of global DTM"),
    PXPERDEG("-999", "[pixel per degree] grid spacing of global map."),
    LLCLAT("-999", "[deg]"),
    LLCLNG("-999", "[deg]"),
    ULCLAT("-999", "[deg]"),
    ULCLNG("-999", "[deg]"),
    URCLAT("-999", "[deg]"),
    URCLNG("-999", "[deg]"),
    LRCLAT("-999", "[deg]"),
    LRCLNG("-999", "[deg]"),
    CNTR_V_X("-999", "[km]"),
    CNTR_V_Y("-999", "[km]"),
    CNTR_V_Z("-999", "[km]"),
    UX_X("-999", "[m]"),
    UX_Y("-999", "[m]"),
    UX_Z("-999", "[m]"),
    UY_X("-999", "[m]"),
    UY_Y("-999", "[m]"),
    UY_Z("-999", "[m]"),
    UZ_X("-999", "[m]"),
    UZ_Y("-999", "/[m]"),
    UZ_Z("-999", "[m]"),
    GSD("-999", "[mm] grid spacing in units/pixel"),
    GSDI("-999", "[unk] Ground sample distance integer"),
    SIGMA("-999", "Global uncertainty of the data [m]"),
    SIG_DEF("Uncertainty", "SIGMA uncertainty metric"),
    DQUAL_1("-999", "Data quality metric; incidence directions"),
    DQUAL_2("-999", "Data quality metric; emission directions"),
    DSIG_DEF("UNK", "Defines uncertainty metric in ancillary file"),
    END(null, null),

    // additional fits tags describing gravity derived values
    DENSITY("-999", "[kgm^-3] density of body"),
    ROT_RATE("-999", "[rad/sec] rotation rate of body"),
    REF_POT("-999", "[J/kg] reference potential of body"),

    // additional fits tags describing tilt derived values
    TILT_RAD("-999", "[m]"),
    TILT_MAJ("-999", "[m] semi-major axis of ellipse for tilt calcs"),
    TILT_MIN("-999", "[m] semi-minor axis of ellipse for tilt calcs"),
    TILT_PA("-999", "[deg] position angle of ellipse for tilt calcs"),

    // Additional fits tags specific to ancillary fits files
    MAP_NAME("FillMeIn", "Map data type"),
    MAP_TYPE("FillMeIn", "Defines whether this is a global or local map"),
    OBJ_FILE("TEMPLATEENTRY", null),

    // keywords specific to facet mapping ancillary file
    OBJINDX("UNK", "OBJ indexed to OBJ_FILE"),
    GSDINDX("-999", "[unk] Ground sample distance of OBJINDX"),
    GSDINDXI("-999", "[unk] GSDINDX integer"),

    // return this enum when no match is found
    NOMATCH(null, "could not determine");

    // PIXPDEG(null,"pixels per degree","pixel/deg"),
    // PIX_SZ(null, "mean size of pixels at equator (meters)","m");

    private FitsValCom fitsValCom;

    private HeaderTag(String value, String comment) {
        this.fitsValCom = new FitsValCom(value, comment);
    }

    public String value() {
        return this.fitsValCom.getV();
    }

    public String comment() {
        return this.fitsValCom.getC();
    }

    /**
     * Contains all valid Fits keywords for this Enum. 'NOMATCH' is not a valid Fits keyword
     */
    public static final EnumSet<HeaderTag> fitsKeywords = EnumSet.range(HeaderTag.SIMPLE, HeaderTag.GSDINDXI);

    public static final EnumSet<HeaderTag> globalDTMFitsData = EnumSet.of(HeaderTag.CLAT, HeaderTag.CLON);

    /**
     * Return the HeaderTag associated with a given string. returns NOMATCH enum if no match found.
     *
     * @param value
     * @return
     */
    public static HeaderTag tagFromString(String value) {
        for (HeaderTag tagName : values()) {
            if (tagName.toString().equals(value)) {
                return tagName;
            }
        }
        return NOMATCH;
    }

    /**
     * Return the "Source Data Product" (SDP) string to be used in a product file naming convention.
     * The SDP string may not always be the same as the data source.
     *
     * For example, for ALTWG products, the dataSource could be "OLA" but the SDP string in the
     * filename is supposed to be "alt".
     *
     * @param dataSource
     * @return
     */
    public static String getSDP(String dataSource) {
        String sdp = dataSource;
        if (dataSource.equals("ola")) {
            sdp = "alt";
        }
        return sdp;
    }
}
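`tagFromString` above scans `values()` and falls back to a `NOMATCH` sentinel rather than returning null, so call sites never need a null check. A standalone sketch of that lookup pattern, reduced to three placeholder constants:

```java
// Standalone sketch of HeaderTag.tagFromString's linear lookup over values(),
// with a NOMATCH sentinel instead of null. Constants reduced for brevity.
public class TagSketch {

    enum Tag {
        CLON,
        CLAT,
        NOMATCH;

        static Tag fromString(String s) {
            // linear scan over all constants, comparing their string names
            for (Tag t : values()) {
                if (t.toString().equals(s)) {
                    return t;
                }
            }
            return NOMATCH; // sentinel avoids null checks at call sites
        }
    }

    public static void main(String[] args) {
        System.out.println(Tag.fromString("CLAT"));
        System.out.println(Tag.fromString("BOGUS"));
    }
}
```

Unlike the built-in `Enum.valueOf`, which throws `IllegalArgumentException` for unknown names, this scan degrades gracefully for arbitrary input strings.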


@@ -35,157 +35,150 @@ import terrasaur.utils.DTMHeader;
 * HeaderCardFactory static methods. This is because the NFT MLN header updates are independent of
 * updates to the ALTWG product fits headers or the Map Formats headers. static methods in the
 * HeaderCardFactory
 *
 * @author espirrc1
 */
public class NFTmln extends DTMFits implements DTMHeader {

    // private FitsData fitsData;
    private boolean dataContained = false;

    public NFTmln(FitsHdr fitsHeader) {
        super(fitsHeader, FitsHeaderType.NFTMLN);
    }

    public List<HeaderCard> createFitsHeader(List<HeaderCard> planeList) throws HeaderCardException {
        List<HeaderCard> headerCards = new ArrayList<HeaderCard>();
        headerCards.addAll(getHeaderInfo("Header Information"));
        headerCards.addAll(getMissionInfo("Mission Information"));
        headerCards.addAll(getIDInfo("Observation Information"));
        // headerCards.addAll(getShapeSrcInfo());
        headerCards.addAll(getMapDataSrc("Data Source"));
        headerCards.addAll(getTimingInfo());
        headerCards.addAll(getProcInfo("Processing Information"));
        headerCards.addAll(getSpecificInfo("Product Specific Information"));
        headerCards.addAll(getPlaneInfo("", planeList));
        // end keyword
        headerCards.add(getEnd());
        return headerCards;
    }

    /**
     * Custom header block.
     *
     * @return
     * @throws HeaderCardException
     */
    @Override
    public List<HeaderCard> getHeaderInfo(String comment) throws HeaderCardException {
        List<HeaderCard> headers = new ArrayList<HeaderCard>();
        if (comment.length() > 0) {
            headers.add(new HeaderCard(COMMENT, comment, false));
        }
        headers.add(fitsHdr.getHeaderCard(HeaderTag.NPRDVERS));
        return headers;
    }

    /**
     * Fits header block containing information about the source data used for the shape model. This
     * method does not exist in the parent class.
     *
     * @return
     * @throws HeaderCardException
     */
    public List<HeaderCard> getShapeSrcInfo() throws HeaderCardException {
        List<HeaderCard> headers = new ArrayList<HeaderCard>();
        headers.add(new HeaderCard(COMMENT, "Shape Data Source", false));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCF));
        return headers;
    }

    /**
     * Custom source map data block.
     */
    @Override
    public List<HeaderCard> getMapDataSrc(String comment) throws HeaderCardException {
        List<HeaderCard> headers = new ArrayList<HeaderCard>();
        if (comment.length() > 0) {
            headers.add(new HeaderCard(COMMENT, comment, false));
        }
        headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRC));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCV));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFTWARE));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFT_VER));
        return headers;
    }

    /**
     * Custom product specific info data block
     *
     * @throws HeaderCardException
     */
    public List<HeaderCard> getSpecificInfo(String comment) throws HeaderCardException {
        List<HeaderCard> headers = new ArrayList<HeaderCard>();
        if (comment.length() > 0) {
            headers.add(new HeaderCard(COMMENT, comment, false));
        }
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLON));
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.CLAT));
        headers.addAll(getCornerCards());
        headers.addAll(getCenterVec());
        headers.addAll(getUX());
        headers.addAll(getUY());
        headers.addAll(getUZ());
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.GSD));
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.SIGMA));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.SIG_DEF));
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.DQUAL_1));
        headers.add(fitsHdr.getHeaderCardD(HeaderTag.DQUAL_2));
        return headers;
    }

    /**
     * Fits header block containing information about when products were created. This method does not
     * exist in the parent class.
     *
     * @return
     * @throws HeaderCardException
     */
    public List<HeaderCard> getTimingInfo() throws HeaderCardException {
        List<HeaderCard> headers = new ArrayList<HeaderCard>();
        headers.add(new HeaderCard(COMMENT, "Timing Information", false));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCD));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.DATENPRD));
        return headers;
    }

    /**
     * Custom processing info block.
     */
    public List<HeaderCard> getProcInfo(String comment) throws HeaderCardException {
        List<HeaderCard> headers = new ArrayList<HeaderCard>();
        if (comment.length() > 0) {
            headers.add(new HeaderCard(COMMENT, comment, false));
        }
        headers.add(fitsHdr.getHeaderCard(HeaderTag.PRODNAME));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.PRODVERS));
        headers.add(fitsHdr.getHeaderCard(HeaderTag.CREATOR));
        return headers;
    }
headers.add(fitsHdr.getHeaderCard(HeaderTag.PRODVERS));
headers.add(fitsHdr.getHeaderCard(HeaderTag.CREATOR));
return headers;
}
}
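Each get*Info method above follows the same block-builder pattern: return an ordered list of cards, optionally led by a COMMENT card when a block comment is supplied. A minimal stand-alone sketch of that pattern, using a plain record as a stand-in for nom.tam's HeaderCard (the class and values here are illustrative, not the Terrasaur API):

```java
import java.util.ArrayList;
import java.util.List;

public class HeaderBlockDemo {
  // Minimal stand-in for nom.tam's HeaderCard: a keyword plus a value.
  record Card(String key, String value) {}

  // Mirrors the get*Info methods: an optional COMMENT card first, then
  // keyword cards in a fixed order.
  static List<Card> procInfoBlock(String comment) {
    List<Card> headers = new ArrayList<>();
    if (comment.length() > 0) {
      headers.add(new Card("COMMENT", comment));
    }
    headers.add(new Card("PRODNAME", "example-product"));
    headers.add(new Card("PRODVERS", "1.1.0"));
    headers.add(new Card("CREATOR", "Terrasaur"));
    return headers;
  }

  public static void main(String[] args) {
    List<Card> block = procInfoBlock("Processing Information");
    // The COMMENT card leads the block; keyword cards keep insertion order.
    System.out.println(block.get(0).key() + " leads " + block.size() + " cards");
  }
}
```

An empty comment string simply suppresses the leading COMMENT card, which is why callers can reuse one method for both commented and bare blocks.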


@@ -34,12 +34,12 @@ import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.TimeZone;
import nom.tam.fits.BasicHDU;
import nom.tam.fits.Fits;
import nom.tam.fits.FitsException;
import nom.tam.fits.HeaderCard;
import nom.tam.util.Cursor;
import org.apache.commons.io.FilenameUtils;
import terrasaur.enums.FitsHeaderType;
import terrasaur.enums.PlaneInfo;
import terrasaur.fits.FitsHdr.FitsHdrBuilder;
@@ -53,371 +53,360 @@ import terrasaur.utils.xml.AsciiFile;
*/
public class ProductFits {
  public static final String PLANE = "PLANE";
  public static final String COMMENT = "COMMENT";
  /**
   * Open the fits file and extract fits keywords which start with "PLANE". The convention is that
   * these are the keywords which describe the planes in the fits datacube.
   *
   * @param fitsFile
   * @return
   * @throws FitsException
   * @throws IOException
   */
  public static List<HeaderCard> getPlaneHeaderCards(String fitsFile) throws FitsException, IOException {
    Fits inFits = new Fits(fitsFile);
    List<HeaderCard> planeHeaders = getPlaneHeaderCards(inFits);
    return planeHeaders;
  }
  /**
   * Open the fits object and extract fits keywords which start with "PLANE". The convention is that
   * these are the keywords which describe the planes in the fits datacube.
   *
   * @param inFitsFile
   * @return
   * @throws FitsException
   * @throws IOException
   */
  public static List<HeaderCard> getPlaneHeaderCards(Fits inFitsFile) throws FitsException, IOException {
    BasicHDU inHdu = inFitsFile.getHDU(0);
    List<HeaderCard> planeHeaders = getPlaneHeaderCards(inHdu);
    return planeHeaders;
  }
  /**
   * Parse the HeaderDataUnit (HDU) and extract fits keywords which start with "PLANE". The
   * convention is that these are the keywords which describe the planes in the fits datacube.
   *
   * @param inHdu
   * @return
   */
  public static List<HeaderCard> getPlaneHeaderCards(BasicHDU inHdu) {
    List<HeaderCard> planeHeaders = new ArrayList<HeaderCard>();
    Cursor cursor = inHdu.getHeader().iterator();
    while (cursor.hasNext()) {
      HeaderCard hc = (HeaderCard) cursor.next();
      if (hc.getKey().startsWith(PLANE)) planeHeaders.add(hc);
    }
    return planeHeaders;
  }
  /**
   * Parse fits header and determine min/max latitude and longitude. For global fits files will just
   * parse keywords that directly contain the min/max lat/lon values. For regional fits files will
   * parse the latlon corner keywords and determine min, max lat/lon values.
   *
   * @param fitsFile
   * @return Map&lt;String, Double&gt; where string is the .toString() of HeaderTags MINLON, MAXLON,
   *     MINLAT, MAXLAT.
   * @throws IOException
   * @throws FitsException
   */
  public static Map<String, Double> minMaxLLFromFits(File fitsFile) throws FitsException, IOException {
    System.out.println("Determining minmax lat lon from fits file:" + fitsFile.getAbsolutePath());
    // initialize output
    Map<String, Double> minMaxLL = new HashMap<String, Double>();
    Map<String, HeaderCard> fitsHeaders = FitsUtil.getFitsHeaderAsMap(fitsFile.getAbsolutePath());
    // check whether MINLON keyword exists.
    String keyword = HeaderTag.MINLON.toString();
    boolean failOnNull = false;
    HeaderCard thisCard = FitsUtil.getCard(fitsHeaders, keyword, failOnNull);
    if (thisCard == null) {
      // assume LatLon corner keywords exist
      // Map<String, HeaderCard> cornerCards = latLonCornersFromMap(fitsHeaders);
      // iterate through Lat corners and find min/max values
      double minLat = 999D;
      double maxLat = -999D;
      failOnNull = true;
      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.LLCLAT.toString(), failOnNull);
      double thisValue = thisCard.getValue(Double.class, Double.NaN);
      minLat = Math.min(minLat, thisValue);
      maxLat = Math.max(maxLat, thisValue);
      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.ULCLAT.toString(), failOnNull);
      thisValue = thisCard.getValue(Double.class, Double.NaN);
      minLat = Math.min(minLat, thisValue);
      maxLat = Math.max(maxLat, thisValue);
      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.URCLAT.toString(), failOnNull);
      thisValue = thisCard.getValue(Double.class, Double.NaN);
      minLat = Math.min(minLat, thisValue);
      maxLat = Math.max(maxLat, thisValue);
      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.LRCLAT.toString(), failOnNull);
      thisValue = thisCard.getValue(Double.class, Double.NaN);
      minLat = Math.min(minLat, thisValue);
      maxLat = Math.max(maxLat, thisValue);
      // double check center lat. For polar observations the center latitude could be
      // either the minimum or maximum latitude
      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.CLAT.toString(), failOnNull);
      thisValue = thisCard.getValue(Double.class, Double.NaN);
      minLat = Math.min(minLat, thisValue);
      maxLat = Math.max(maxLat, thisValue);
      minMaxLL.put(HeaderTag.MINLAT.toString(), minLat);
      minMaxLL.put(HeaderTag.MAXLAT.toString(), maxLat);
      // iterate through Lon corners and find min/max values
      double minLon = 999D;
      double maxLon = -999D;
      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.LLCLNG.toString(), failOnNull);
      thisValue = thisCard.getValue(Double.class, Double.NaN);
      minLon = Math.min(minLon, thisValue);
      maxLon = Math.max(maxLon, thisValue);
      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.ULCLNG.toString(), failOnNull);
      thisValue = thisCard.getValue(Double.class, Double.NaN);
      minLon = Math.min(minLon, thisValue);
      maxLon = Math.max(maxLon, thisValue);
      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.URCLNG.toString(), failOnNull);
      thisValue = thisCard.getValue(Double.class, Double.NaN);
      minLon = Math.min(minLon, thisValue);
      maxLon = Math.max(maxLon, thisValue);
      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.LRCLNG.toString(), failOnNull);
      thisValue = thisCard.getValue(Double.class, Double.NaN);
      minLon = Math.min(minLon, thisValue);
      maxLon = Math.max(maxLon, thisValue);
      // double check center lon
      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.CLON.toString(), failOnNull);
      thisValue = thisCard.getValue(Double.class, Double.NaN);
      minLon = Math.min(minLon, thisValue);
      maxLon = Math.max(maxLon, thisValue);
      minMaxLL.put(HeaderTag.MINLON.toString(), minLon);
      minMaxLL.put(HeaderTag.MAXLON.toString(), maxLon);
    } else {
      // assume min/max lat/lon keywords exist
      // already queried for MINLON
      double thisVal = thisCard.getValue(Double.class, Double.NaN);
      minMaxLL.put(HeaderTag.MINLON.toString(), thisVal);
      // get MAXLON
      failOnNull = true;
      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.MAXLON.toString(), failOnNull);
      thisVal = thisCard.getValue(Double.class, Double.NaN);
      minMaxLL.put(HeaderTag.MAXLON.toString(), thisVal);
      // get MINLAT
      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.MINLAT.toString(), failOnNull);
      thisVal = thisCard.getValue(Double.class, Double.NaN);
      minMaxLL.put(HeaderTag.MINLAT.toString(), thisVal);
      // get MAXLAT
      thisCard = FitsUtil.getCard(fitsHeaders, HeaderTag.MAXLAT.toString(), failOnNull);
      thisVal = thisCard.getValue(Double.class, Double.NaN);
      minMaxLL.put(HeaderTag.MAXLAT.toString(), thisVal);
    }
    return minMaxLL;
  }
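The regional branch above folds four corner values plus the center value into a running min/max, starting from 999/-999 sentinels that any real latitude or longitude will replace. A stand-alone sketch of that fold (class name and sample values are illustrative only):

```java
public class CornerMinMaxDemo {
  // Mirrors the regional branch of minMaxLLFromFits: accumulate min/max over
  // corner and center values, starting from 999/-999 sentinels.
  static double[] minMax(double... values) {
    double min = 999D;
    double max = -999D;
    for (double v : values) {
      min = Math.min(min, v);
      max = Math.max(max, v);
    }
    return new double[] {min, max};
  }

  public static void main(String[] args) {
    // Four corner latitudes plus the center latitude (hypothetical values);
    // for a polar region the center itself can be the extreme, which is why
    // CLAT/CLON are included in the fold.
    double[] latRange = minMax(80.0, 81.5, 80.2, 81.0, 90.0);
    System.out.println(latRange[0] + " .. " + latRange[1]); // 80.0 .. 90.0
  }
}
```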
/**
* Parse and extract list of PlaneInfo enumerations that describe the planes contained in the fits
* file. Return empty list if none of the planes match.
*
* @param fitsFile
* @return
* @throws IOException
* @throws FitsException
*/
public static List<PlaneInfo> planesFromFits(String fitsFile) throws FitsException, IOException {
List<PlaneInfo> planes = new ArrayList<PlaneInfo>();
// extract header cards
List<HeaderCard> planeCards = getPlaneHeaderCards(fitsFile);
/*
* planeCards are assumed to follow the fits keyword naming convention of PLANEN, where N goes
* from 0 to the number of planes in the fits file - 1. They correspond directly to the order in
* which the data is stored in the data cube. For example, PLANE0 = "Latitude" indicates that
* the 0th plane in the fits datacube contains the latitude values.
*/
for (HeaderCard thisPlane : planeCards) {
PlaneInfo thisPlaneInfo = PlaneInfo.keyVal2Plane(thisPlane.getValue());
if (thisPlaneInfo != null) {
planes.add(thisPlaneInfo);
}
}
return planes;
}
/**
* Save data to a Fits Datacube. Assumes hdrBuilder is pre-loaded with all the keyword values that
* can be known prior to this method. An example of a keyword value that cannot be known prior to
* this method is the ALTWG filename. This can only be created within this method by using
* metadata contained in the FitsHdrBuilder. The map information in fitsData will also be used to
* populate hdrBuilder if it is available. If it is not available then the method assumes the
* information is already in hdrBuilder. Ex. if fitsData was populated by reading a fits file then
* it may not have vector to center or unit vectors describing the reference frame. However, that
* information may be in the fits header, which should be captured by hdrBuilder.
*
* @param fitsData
* @param planeList
* @param outFname
* @param hdrBuilder
* @param hdrType
* @param crossrefFile - if not null then method will create a cross-reference file. The
* cross-reference file allows the pipeline to cross reference an original filenaming
* convention with the mission specific one.
* @throws FitsException
* @throws IOException
*/
public static void saveDataCubeFits(
FitsData fitsData,
List<PlaneInfo> planeList,
String outFname,
FitsHdrBuilder hdrBuilder,
FitsHeaderType hdrType,
File crossrefFile)
throws FitsException, IOException {
hdrBuilder.setByFitsData(fitsData);
String fitsFname = outFname;
if (crossrefFile != null) {
// save outfile name in cross-reference file, for future reference
String path = FilenameUtils.getFullPath(outFname);
if (path.length() == 0) path = ".";
String outBaseName = String.format("%s%s%s", path, File.pathSeparator, FilenameUtils.getBaseName(outFname));
AsciiFile crfFile = new AsciiFile(crossrefFile.getAbsolutePath());
crfFile.streamSToFile(outBaseName, 0);
crfFile.closeFile();
}
// set date this product was produced
hdrBuilder.setDateprod();
DTMHeader fitsHeader = FitsHeaderFactory.getDTMHeader(hdrBuilder.build(), hdrType);
fitsHeader.setData(fitsData);
List<HeaderCard> headers = fitsHeader.createFitsHeader(PlaneInfo.planesToHeaderCard(planeList));
System.out.println("saving to fits file:" + fitsFname);
FitsUtil.saveFits(fitsData.getData(), fitsFname, headers);
}
/**
* General method for saving 3D double array with fits header as defined in fitsHdrbuilder.
* Assumes FitsHdrBuilder contains all the keywords that will be written to the fits header in the
* order that they should be written.
*
* @param dataCube
* @param hdrBuilder
* @throws IOException
* @throws FitsException
*/
public static void saveDataCubeFits(
double[][][] dataCube, FitsHdrBuilder hdrBuilder, List<HeaderCard> planeHeaders, String outFile)
throws FitsException, IOException {
FitsHdr fitsHdr = hdrBuilder.build();
List<HeaderCard> headers = new ArrayList<HeaderCard>();
for (String keyword : fitsHdr.fitsKV.keySet()) {
headers.add(fitsHdr.fitsKV.get(keyword));
}
saveDataCubeFits(dataCube, headers, planeHeaders, outFile);
}
/**
* General method for writing to a fits file a 3D double array with fits headers as defined in
* List<HeaderCard> and planeKeywords as defined in separate List<HeaderCard>. plane keywords
* defining the planes of the dataCube in a separate List<HeaderCard>
*
* @param dataCube
* @param headers
* @param planeKeywords - can be null. If so then assumes headers contains all the keyword
* information to be written to the file.
* @param outFname
* @throws IOException
* @throws FitsException
*/
public static void saveDataCubeFits(
double[][][] dataCube, List<HeaderCard> headers, List<HeaderCard> planeKeywords, String outFname)
throws FitsException, IOException {
// append planeHeaders
if (!planeKeywords.isEmpty()) {
headers.addAll(planeKeywords);
}
FitsUtil.saveFits(dataCube, outFname, headers);
}
/**
* Save NFT MLN.
*
* @param fitsData
* @param planeList
* @param outfile
* @param hdrBuilder
* @param hdrType
* @param crossrefFile
* @throws FitsException
* @throws IOException
*/
public static void saveNftFits(
FitsData fitsData,
List<PlaneInfo> planeList,
String outfile,
FitsHdrBuilder hdrBuilder,
FitsHeaderType hdrType,
File crossrefFile)
throws FitsException, IOException {
// public static void saveDataCubeFits(FitsData fitsData, List<PlaneInfo> planeList, String
// outFname,
// FitsHdrBuilder hdrBuilder, FitsHeaderType hdrType,
// File crossrefFile) throws FitsException, IOException {
String outNftFile = outfile;
Path outPath = Paths.get(outfile);
String nftFitsName = outPath.getFileName().toString();
// need to replace the product name in the headers list. No need to change comments.
hdrBuilder.setVbyHeaderTag(HeaderTag.PRODNAME, nftFitsName);
// set date this product was produced. Uses NFT specific keyword
DateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss");
dateFormat.setTimeZone(TimeZone.getTimeZone("GMT"));
Date date = new Date();
hdrBuilder.setVbyHeaderTag(HeaderTag.DATENPRD, dateFormat.format(date));
FitsHdr fitsHeader = hdrBuilder.build();
DTMHeader nftFitsHeader = FitsHeaderFactory.getDTMHeader(fitsHeader, FitsHeaderType.NFTMLN);
nftFitsHeader.setData(fitsData);
List<HeaderCard> headers = nftFitsHeader.createFitsHeader(PlaneInfo.planesToHeaderCard(planeList));
System.out.println("Saving to " + outNftFile);
FitsUtil.saveFits(fitsData.getData(), outNftFile, headers);
}
}
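The PLANEN convention used throughout this class (PLANE0, PLANE1, ... name the planes of the datacube in storage order) can be sketched without nom.tam by filtering an ordered keyword map; the class and header values here are hypothetical:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class PlaneKeywordDemo {
  // Mirrors getPlaneHeaderCards: keep every header keyword starting with
  // "PLANE", preserving header order. PLANEn describes the n-th cube plane.
  static List<String> planeKeys(Map<String, String> header) {
    return header.keySet().stream().filter(k -> k.startsWith("PLANE")).toList();
  }

  public static void main(String[] args) {
    Map<String, String> header = new LinkedHashMap<>();
    header.put("NAXIS", "3");
    header.put("PLANE0", "Latitude"); // 0th plane of the cube holds latitude
    header.put("PLANE1", "Longitude");
    System.out.println(planeKeys(header)); // [PLANE0, PLANE1]
  }
}
```

planesFromFits then maps each such value (e.g. "Latitude") onto a PlaneInfo enumeration, silently skipping values it does not recognize.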


@@ -23,26 +23,25 @@
package terrasaur.fits;
public enum UnitDir {
  UX {
    public int getAxis() {
      return 1;
    }
  },
  UY {
    public int getAxis() {
      return 2;
    }
  },
  UZ {
    public int getAxis() {
      return 3;
    }
  };

  public abstract int getAxis();
}
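UnitDir uses constant-specific class bodies: each enum constant overrides the abstract getAxis() method. A stand-alone re-creation of the same pattern (wrapped in a demo class so it compiles outside the terrasaur.fits package):

```java
public class UnitDirDemo {
  // Re-creation of the UnitDir pattern: every constant supplies its own
  // implementation of the abstract method declared on the enum.
  enum UnitDir {
    UX {
      public int getAxis() {
        return 1;
      }
    },
    UY {
      public int getAxis() {
        return 2;
      }
    },
    UZ {
      public int getAxis() {
        return 3;
      }
    };

    public abstract int getAxis();
  }

  public static void main(String[] args) {
    System.out.println(UnitDir.UY.getAxis()); // 2
  }
}
```

This avoids a switch over the constants: adding a new direction without an axis implementation is a compile error rather than a silent fall-through.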


@@ -36,118 +36,108 @@ import javafx.scene.control.ChoiceBox;
import javafx.scene.control.Label;
import javafx.scene.control.TextField;
import javafx.stage.Stage;
import spice.basic.SpiceException;
import terrasaur.apps.TranslateTime;
import terrasaur.utils.AppVersion;
public class TranslateTimeController implements Initializable {
  private TranslateTime tt;

  public TranslateTimeController(Stage stage) {
    this.tt = TranslateTimeFX.tt;
  }

  @Override
  public void initialize(URL location, ResourceBundle resources) {
    this.title.setText("Translate Time");
    this.version.setText(AppVersion.getVersionString());

    // populate SCLK menu
    NavigableSet<Integer> sclkIDs = new TreeSet<>(TranslateTimeFX.sclkIDs);
    this.sclkChoice.getItems().addAll(sclkIDs);
    this.sclkChoice.getSelectionModel().selectedIndexProperty().addListener(new ChangeListener<Number>() {
      @Override
      public void changed(ObservableValue<? extends Number> observable, Number oldValue, Number newValue) {
        tt.setSCLKKernel(sclkChoice.getItems().get((Integer) newValue));
      }
    });
    this.sclkChoice.getSelectionModel().select(0);

    try {
      String zTime = ZonedDateTime.now(ZoneOffset.UTC).toString().strip();
      // strip off the final "Z"
      this.tt.setUTC(zTime.substring(0, zTime.length() - 1));
      updateTime();
    } catch (SpiceException e) {
      e.printStackTrace();
    }
  }

  @FXML
  private Label title;

  @FXML
  private Label version;

  @FXML
  private TextField julianString;

  @FXML
  private void setJulian() throws NumberFormatException, SpiceException {
    if (julianString.getText().trim().length() > 0) tt.setJulianDate(Double.parseDouble(julianString.getText()));
    updateTime();
  }

  @FXML
  private ChoiceBox<Integer> sclkChoice;

  @FXML
  private TextField sclkString;

  @FXML
  private void setSCLK() throws SpiceException {
    if (sclkString.getText().trim().length() > 0) tt.setSCLK(sclkString.getText());
    updateTime();
  }

  @FXML
  private TextField tdbString;

  @FXML
  private void setTDB() throws SpiceException {
    if (tdbString.getText().trim().length() > 0) tt.setTDB(Double.parseDouble(tdbString.getText()));
    updateTime();
  }

  @FXML
  private TextField tdbCalendarString;

  @FXML
  private void setTDBCalendar() throws SpiceException {
    if (tdbCalendarString.getText().trim().length() > 0) tt.setTDBCalendarString(tdbCalendarString.getText());
    updateTime();
  }

  @FXML
  private TextField utcString;

  @FXML
  private Label utcLabel;

  @FXML
  private void setUTC() throws SpiceException {
    if (utcString.getText().trim().length() > 0) tt.setUTC(utcString.getText());
    updateTime();
  }

  private void updateTime() throws SpiceException {
    julianString.setText(tt.toJulian());
    sclkString.setText(tt.toSCLK().toString());
    tdbString.setText(String.format("%.6f", tt.toTDB().getTDBSeconds()));
    tdbCalendarString.setText(tt.toTDB().toString("YYYY-MM-DDTHR:MN:SC.### ::TDB"));
    utcString.setText(tt.toTDB().toUTCString("ISOC", 3));
    utcLabel.setText(String.format("UTC (DOY %s)", tt.toTDB().toString("DOY")));
  }
}
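The initializer above seeds the UTC field from the current system time by taking the ISO-8601 string from `ZonedDateTime` and stripping the trailing `Z` zone designator before handing it to SPICE. A minimal self-contained sketch of just that string handling (the `TranslateTime` call itself is omitted; the class and method names here are illustrative, not part of Terrasaur):

```java
import java.time.ZoneOffset;
import java.time.ZonedDateTime;

public class UtcStringDemo {

    /** Drop a trailing "Z" zone designator, if present, from an ISO-8601 instant string. */
    static String stripZoneDesignator(String isoInstant) {
        String s = isoInstant.strip();
        return s.endsWith("Z") ? s.substring(0, s.length() - 1) : s;
    }

    public static void main(String[] args) {
        // ZonedDateTime.toString() at ZoneOffset.UTC ends with "Z", e.g. 2025-07-30T16:07:15.123Z
        String zTime = ZonedDateTime.now(ZoneOffset.UTC).toString();
        System.out.println(stripZoneDesignator(zTime));
    }
}
```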

View File

@@ -33,34 +33,33 @@ import terrasaur.apps.TranslateTime;
public class TranslateTimeFX extends Application {

    // package private so it's visible from TranslateTimeController
    static TranslateTime tt;
    static Collection<Integer> sclkIDs;

    public static void setSCLKIDs(Collection<Integer> sclkIDs) {
        TranslateTimeFX.sclkIDs = sclkIDs;
    }

    public static void setTranslateTime(TranslateTime tt) {
        TranslateTimeFX.tt = tt;
    }

    public static void main(String[] args) {
        launch(args);
        Platform.exit();
    }

    @Override
    public void start(Stage stage) throws Exception {
        FXMLLoader loader = new FXMLLoader();
        loader.setLocation(getClass().getResource("/terrasaur/gui/TranslateTime.fxml"));
        TranslateTimeController controller = new TranslateTimeController(stage);
        loader.setController(controller);
        Parent root = loader.load();
        Scene scene = new Scene(root);
        stage.setScene(scene);
        stage.show();
    }
}

View File

@@ -22,7 +22,6 @@
*/
package terrasaur.smallBodyModel;
import picante.math.intervals.Interval;
import picante.math.intervals.UnwritableInterval;
import picante.math.vectorspace.UnwritableVectorIJK;
@@ -31,226 +30,227 @@ import picante.math.vectorspace.VectorIJK;
/**
 * The BoundingBox class is a data structure for storing the bounding box enclosing a 3-dimensional
 * box-shaped region.
 *
 * @author kahneg1
 * @version 1.0
 */
public class BoundingBox {

    private Interval xRange;
    private Interval yRange;
    private Interval zRange;

    /** Return a BoundingBox with each range set to {@link Interval#ALL_DOUBLES}. */
    public BoundingBox() {
        this(Interval.ALL_DOUBLES, Interval.ALL_DOUBLES, Interval.ALL_DOUBLES);
    }

    /**
     * Return a bounding box with its X limits set to elements 0 and 1, Y limits set to elements 2
     * and 3, and Z limits set to elements 4 and 5 of the input array.
     *
     * @param bounds
     */
    public BoundingBox(double[] bounds) {
        this(
                new Interval(bounds[0], bounds[1]),
                new Interval(bounds[2], bounds[3]),
                new Interval(bounds[4], bounds[5]));
    }

    /**
     * Return a BoundingBox with the supplied dimensions.
     *
     * @param xRange
     * @param yRange
     * @param zRange
     */
    public BoundingBox(UnwritableInterval xRange, UnwritableInterval yRange, UnwritableInterval zRange) {
        this.xRange = new Interval(xRange);
        this.yRange = new Interval(yRange);
        this.zRange = new Interval(zRange);
    }

    /**
     * Return a new BoundingBox with the same center as this instance but with each side's length
     * scaled by scale.
     *
     * @param scale
     * @return
     */
    public BoundingBox getScaledBoundingBox(double scale) {
        double center = xRange.getMiddle();
        double length = xRange.getLength() * scale;
        Interval newXRange = new Interval(center - length / 2, center + length / 2);
        center = yRange.getMiddle();
        length = yRange.getLength() * scale;
        Interval newYRange = new Interval(center - length / 2, center + length / 2);
        center = zRange.getMiddle();
        length = zRange.getLength() * scale;
        Interval newZRange = new Interval(center - length / 2, center + length / 2);
        return new BoundingBox(newXRange, newYRange, newZRange);
    }

    /** Set the X dimension */
    public void setXRange(UnwritableInterval range) {
        this.xRange.setTo(range);
    }

    /** Set the Y dimension */
    public void setYRange(UnwritableInterval range) {
        this.yRange.setTo(range);
    }

    /** Set the Z dimension */
    public void setZRange(UnwritableInterval range) {
        this.zRange.setTo(range);
    }

    /** Return the X dimension */
    public UnwritableInterval getXRange() {
        return new UnwritableInterval(xRange);
    }

    /** Return the Y dimension */
    public UnwritableInterval getYRange() {
        return new UnwritableInterval(yRange);
    }

    /** Return the Z dimension */
    public UnwritableInterval getZRange() {
        return new UnwritableInterval(zRange);
    }

    /**
     * Expand the bounding box to contain the supplied point if it is not already contained.
     *
     * @param point
     */
    public void update(UnwritableVectorIJK point) {
        xRange.set(Math.min(point.getI(), xRange.getBegin()), Math.max(point.getI(), xRange.getEnd()));
        yRange.set(Math.min(point.getJ(), yRange.getBegin()), Math.max(point.getJ(), yRange.getEnd()));
        zRange.set(Math.min(point.getK(), zRange.getBegin()), Math.max(point.getK(), zRange.getEnd()));
    }

    /**
     * Check if this instance intersects the other. The intersection test in each dimension is
     * {@link UnwritableInterval#closedIntersects(UnwritableInterval)}.
     *
     * @param other
     * @return
     */
    public boolean intersects(BoundingBox other) {
        return xRange.closedIntersects(other.xRange)
                && yRange.closedIntersects(other.yRange)
                && zRange.closedIntersects(other.zRange);
    }

    /**
     * @return the length of the largest side of the box
     */
    public double getLargestSide() {
        return Math.max(xRange.getLength(), Math.max(yRange.getLength(), zRange.getLength()));
    }

    /**
     * @return the center of the box
     */
    public VectorIJK getCenterPoint() {
        return new VectorIJK(xRange.getMiddle(), yRange.getMiddle(), zRange.getMiddle());
    }

    /**
     * @return the diagonal length of the box
     */
    public double getDiagonalLength() {
        VectorIJK vec = new VectorIJK(xRange.getLength(), yRange.getLength(), zRange.getLength());
        return vec.getLength();
    }

    /**
     * Returns whether or not the given point is contained in the box. The contains test in each
     * dimension is {@link UnwritableInterval#closedContains(double)}.
     *
     * @param pt
     * @return
     */
    public boolean contains(UnwritableVectorIJK pt) {
        return xRange.closedContains(pt.getI()) && yRange.closedContains(pt.getJ()) && zRange.closedContains(pt.getK());
    }

    /**
     * Returns whether or not the given point is contained in the box. The contains test in each
     * dimension is {@link UnwritableInterval#closedContains(double)}.
     *
     * @param pt
     * @return
     */
    public boolean contains(double[] pt) {
        return contains(new VectorIJK(pt));
    }

    /**
     * Expand the X range by length on each side.
     *
     * @param length
     */
    public void expandX(double length) {
        xRange.set(xRange.getBegin() - length, xRange.getEnd() + length);
    }

    /**
     * Expand the Y range by length on each side.
     *
     * @param length
     */
    public void expandY(double length) {
        yRange.set(yRange.getBegin() - length, yRange.getEnd() + length);
    }

    /**
     * Expand the Z range by length on each side.
     *
     * @param length
     */
    public void expandZ(double length) {
        zRange.set(zRange.getBegin() - length, zRange.getEnd() + length);
    }

    /**
     * Increase the size of the bounding box by adding (subtracting) to each side a specified
     * percentage of the bounding box diagonal.
     *
     * @param fractionOfDiagonalLength must be positive
     */
    public void increaseSize(double fractionOfDiagonalLength) {
        if (fractionOfDiagonalLength > 0.0) {
            double size = fractionOfDiagonalLength * getDiagonalLength();
            xRange.set(xRange.getBegin() - size, xRange.getEnd() + size);
            yRange.set(yRange.getBegin() - size, yRange.getEnd() + size);
            zRange.set(zRange.getBegin() - size, zRange.getEnd() + size);
        }
    }

    @Override
    public String toString() {
        return "xmin: " + xRange.getBegin() + " xmax: " + xRange.getEnd() + " ymin: "
                + yRange.getBegin() + " ymax: " + yRange.getEnd() + " zmin: " + zRange.getBegin()
                + " zmax: " + zRange.getEnd();
    }

    @Override
    public boolean equals(Object obj) {
        BoundingBox b = (BoundingBox) obj;
        return xRange.equals(b.xRange) && yRange.equals(b.yRange) && zRange.equals(b.zRange);
    }

    @Override
    public Object clone() {
        return new BoundingBox(xRange, yRange, zRange);
    }
}
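The interval arithmetic in `getScaledBoundingBox` and `increaseSize` is the same scale-about-the-midpoint step applied independently in each dimension. A self-contained sketch of that 1-D step, using plain doubles in place of the picante `Interval` class (class and method names here are illustrative, not part of Terrasaur):

```java
public class IntervalScaleDemo {

    /** Scale the interval [begin, end] about its midpoint; returns {newBegin, newEnd}. */
    static double[] scaledAboutCenter(double begin, double end, double scale) {
        double center = (begin + end) / 2;
        double length = (end - begin) * scale;
        return new double[] {center - length / 2, center + length / 2};
    }

    public static void main(String[] args) {
        // [0, 10] scaled by 2 about its center 5 becomes [-5, 15]
        double[] r = scaledAboutCenter(0, 10, 2);
        System.out.println(r[0] + " " + r[1]);
    }
}
```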

View File

@@ -22,6 +22,7 @@
*/
package terrasaur.smallBodyModel;
import com.google.common.collect.HashMultimap;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
@@ -34,174 +35,174 @@ import org.apache.commons.math3.geometry.euclidean.threed.Rotation;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import picante.math.vectorspace.VectorIJK;
import terrasaur.utils.PolyDataStatistics;
import terrasaur.utils.PolyDataUtil;
import terrasaur.utils.math.MathConversions;
import terrasaur.utils.tessellation.FibonacciSphere;
import vtk.vtkPoints;
import vtk.vtkPolyData;
/**
 * Hold a collection of local shape models
 *
 * @author Hari.Nair@jhuapl.edu
 */
public class LocalModelCollection {

    private static final Logger logger = LogManager.getLogger();

    static class LocalModel {
        final Vector3D center;
        final String filename;

        LocalModel(Vector3D center, String filename) {
            this.center = center;
            this.filename = filename;
        }
    }

    // key is tile index, value is collection of localModels
    private HashMultimap<Long, LocalModel> localModelMap;

    private FibonacciSphere tessellation;

    // key is filename, value is shape model
    private ThreadLocal<Map<String, SmallBodyModel>> localModels;

    private Double scale;
    private Rotation rotation;

    /**
     * @param numTiles total number of tiles to use for sorting local models
     */
    public LocalModelCollection(int numTiles, Double scale, Rotation rotation) {
        localModelMap = HashMultimap.create();
        tessellation = new FibonacciSphere(numTiles);
        localModels = new ThreadLocal<>();
        this.scale = scale;
        this.rotation = rotation;
    }

    /**
     * Add a shape model. Models are stored in a map with the center as the key, so an entry with
     * the same center as an existing entry will overwrite the existing one.
     *
     * @param latInRadians
     * @param lonInRadians
     * @param filename
     */
    public void addModel(double latInRadians, double lonInRadians, String filename) {
        Vector3D center = new Vector3D(lonInRadians, latInRadians);
        long tileIndex = tessellation.getTileIndex(MathConversions.toVectorIJK(center));
        LocalModel lm = new LocalModel(center, filename);
        localModelMap.put(tileIndex, lm);
    }

    /**
     * Return a shape model containing the supplied point. This may not be the only shape model
     * that contains this point, just the first one found.
     *
     * @param point
     * @return
     */
    public SmallBodyModel get(Vector3D point) {
        List<String> filenames = getFilenames(point);
        if (filenames.size() == 0) logger.error("No shape models cover {}", point.toString());
        double[] origin = new double[3];
        double[] intersectPoint = new double[3];
        for (String filename : filenames) {
            SmallBodyModel sbm = load(filename);
            long intersect = sbm.computeRayIntersection(origin, point.toArray(), 2 * point.getNorm(), intersectPoint);
            if (intersect != -1) return sbm;
        }
        logger.debug(
                "Failed intersection for lon {}, lat {}",
                Math.toDegrees(point.getAlpha()),
                Math.toDegrees(point.getDelta()));
        return null;
    }

    /**
     * Load a shape model after applying any rotation or scaling
     *
     * @param filename
     * @return
     */
    private SmallBodyModel load(String filename) {
        Map<String, SmallBodyModel> map = localModels.get();
        if (map == null) {
            map = new HashMap<>();
            localModels.set(map);
        }
        SmallBodyModel sbm = map.get(filename);
        if (sbm == null) {
            logger.debug("Thread {}: Loading {}", Thread.currentThread().getId(), FilenameUtils.getBaseName(filename));
            try {
                vtkPolyData model = PolyDataUtil.loadShapeModel(filename);
                if (scale != null || rotation != null) {
                    PolyDataStatistics stats = new PolyDataStatistics(model);
                    Vector3D center = new Vector3D(stats.getCentroid());
                    vtkPoints points = model.GetPoints();
                    for (int i = 0; i < points.GetNumberOfPoints(); i++) {
                        Vector3D thisPoint = new Vector3D(points.GetPoint(i));
                        if (scale != null)
                            thisPoint = thisPoint.subtract(center).scalarMultiply(scale).add(center);
                        if (rotation != null)
                            thisPoint = rotation.applyTo(thisPoint.subtract(center)).add(center);
                        points.SetPoint(i, thisPoint.toArray());
                    }
                }
                sbm = new SmallBodyModel(model);
            } catch (Exception e) {
                logger.error(e.getLocalizedMessage());
            }
            map.put(filename, sbm);
        }
        return map.get(filename);
    }

    /**
     * Return the local model with the closest center to point
     *
     * @param point
     * @return null if no models have been loaded
     */
    private List<String> getFilenames(Vector3D point) {
        VectorIJK ijk = MathConversions.toVectorIJK(point);
        // A sorted map of tiles by distance
        NavigableMap<Double, Integer> distanceMap = tessellation.getDistanceMap(ijk);
        List<String> smallBodyModels = new ArrayList<>();
        for (Double dist : distanceMap.keySet()) {
            // A set of local models with centers in this tile
            Set<LocalModel> localModelSet = localModelMap.get((long) distanceMap.get(dist));
            if (localModelSet.size() > 0) {
                NavigableMap<Double, LocalModel> localDistanceMap = new TreeMap<>();
                for (LocalModel localModel : localModelSet) {
                    double thisDist = Vector3D.angle(localModel.center, point);
                    localDistanceMap.put(thisDist, localModel);
                }
                // add all local models with centers within PI/4 of point
                for (double localDist : localDistanceMap.headMap(Math.PI / 4, true).keySet()) {
                    smallBodyModels.add(localDistanceMap.get(localDist).filename);
                }
            }
        }
        return smallBodyModels;
    }
}
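`getFilenames` sorts the candidate models into a `TreeMap` keyed by angular distance and then takes every entry within π/4 of the query point via `headMap(Math.PI / 4, true)`, where the `true` makes the cutoff inclusive. A minimal sketch of that `NavigableMap` pattern, with hypothetical filenames standing in for real shape models:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.NavigableMap;
import java.util.TreeMap;

public class HeadMapDemo {

    /** Return values whose keys are <= cutoff, in ascending key order. */
    static List<String> withinCutoff(NavigableMap<Double, String> byDistance, double cutoff) {
        return new ArrayList<>(byDistance.headMap(cutoff, true).values());
    }

    public static void main(String[] args) {
        NavigableMap<Double, String> byDistance = new TreeMap<>();
        byDistance.put(0.1, "near.obj"); // hypothetical filenames
        byDistance.put(Math.PI / 4, "edge.obj"); // exactly at the inclusive cutoff
        byDistance.put(1.0, "far.obj");
        System.out.println(withinCutoff(byDistance, Math.PI / 4))); // near.obj and edge.obj survive
    }
}
```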

View File

@@ -28,124 +28,122 @@ import org.immutables.value.Value;
import terrasaur.smallBodyModel.ImmutableSBMTStructure.Builder;
/**
*
*
* <pre>
# SBMT Structure File
# type,point
# ------------------------------------------------------------------------------
# File consists of a list of structures on each line.
#
# Each line is defined by 17 columns with the following:
# &lt;id&gt; &lt;name&gt; &lt;centerXYZ[3]&gt; &lt;centerLLR[3]&gt; &lt;coloringValue[4]&gt; &lt;diameter&gt; &lt;flattening&gt; &lt;regularAngle&gt; &lt;colorRGB&gt; &lt;label&gt;*
#
# id: Id of the structure
# name: Name of the structure
# centerXYZ[3]: 3 columns that define the structure center in 3D space
# centerLLR[3]: 3 columns that define the structure center in lat,lon,radius
# coloringValue[4]: 4 columns that define the ellipse &#8220;standard&#8221; colorings. The
# colorings are: slope (NA), elevation (NA), acceleration (NA), potential (NA)
# diameter: Diameter of (semimajor) axis of ellipse
# flattening: Flattening factor of ellipse. Range: [0.0, 1.0]
# regularAngle: Angle between the semimajor axis and the line of longitude
# as projected onto the surface
# colorRGB: 1 column (of RGB values [0, 255] separated by commas with no
# spaces). This column appears as a single textual column.
# label: Label of the structure
#
#
# Please note the following:
# - Each line is composed of columns separated by a tab character.
# - Blank lines or lines that start with '#' are ignored.
# - Angle units: degrees
# - Length units: kilometers
* # SBMT Structure File
* # type,point
* # ------------------------------------------------------------------------------
* # File consists of a list of structures on each line.
* #
* # Each line is defined by 17 columns with the following:
* # &lt;id&gt; &lt;name&gt; &lt;centerXYZ[3]&gt; &lt;centerLLR[3]&gt; &lt;coloringValue[4]&gt; &lt;diameter&gt; &lt;flattening&gt; &lt;regularAngle&gt; &lt;colorRGB&gt; &lt;label&gt;*
* #
* # id: Id of the structure
* # name: Name of the structure
* # centerXYZ[3]: 3 columns that define the structure center in 3D space
* # centerLLR[3]: 3 columns that define the structure center in lat,lon,radius
* # coloringValue[4]: 4 columns that define the ellipse &#8220;standard&#8221; colorings. The
* # colorings are: slope (NA), elevation (NA), acceleration (NA), potential (NA)
* # diameter: Diameter of (semimajor) axis of ellipse
* # flattening: Flattening factor of ellipse. Range: [0.0, 1.0]
* # regularAngle: Angle between the semimajor axis and the line of longitude
* # as projected onto the surface
* # colorRGB: 1 column (of RGB values [0, 255] separated by commas with no
* # spaces). This column appears as a single textual column.
* # label: Label of the structure
* #
* #
* # Please note the following:
* # - Each line is composed of columns separated by a tab character.
* # - Blank lines or lines that start with '#' are ignored.
* # - Angle units: degrees
* # - Length units: kilometers
* </pre>
*
*
* @author Hari.Nair@jhuapl.edu
*
*/
@Value.Immutable
public abstract class SBMTStructure {
abstract int id();
abstract int id();
abstract String name();
abstract String name();
public abstract Vector3D centerXYZ();
public abstract Vector3D centerXYZ();
abstract String slopeColoring();
abstract String slopeColoring();
abstract String elevationColoring();
abstract String elevationColoring();
abstract String accelerationColoring();
abstract String accelerationColoring();
abstract String potentialColoring();
abstract String potentialColoring();
abstract double diameter();
abstract double diameter();
abstract double flattening();
abstract double flattening();
abstract double regularAngle();
abstract double regularAngle();
abstract Color rgb();
abstract Color rgb();
abstract String label();
abstract String label();
@Override
public String toString() {
StringBuilder sb = new StringBuilder();
sb.append(String.format("%d\t", id()));
sb.append(String.format("%s\t", name()));
sb.append(String.format("%.16f\t", centerXYZ().getX()));
sb.append(String.format("%.16f\t", centerXYZ().getY()));
sb.append(String.format("%.16f\t", centerXYZ().getZ()));
sb.append(String.format("%.16f\t", Math.toDegrees(centerXYZ().getDelta())));
sb.append(String.format("%.16f\t", Math.toDegrees(centerXYZ().getAlpha())));
sb.append(String.format("%.16f\t", centerXYZ().getNorm()));
sb.append(String.format("%s\t", slopeColoring()));
sb.append(String.format("%s\t", elevationColoring()));
sb.append(String.format("%s\t", accelerationColoring()));
sb.append(String.format("%s\t", potentialColoring()));
sb.append(String.format("%f\t", diameter()));
sb.append(String.format("%f\t", flattening()));
sb.append(String.format("%f\t", regularAngle()));
    @Override
    public String toString() {
        StringBuilder sb = new StringBuilder();
        sb.append(String.format("%d\t", id()));
        sb.append(String.format("%s\t", name()));
        sb.append(String.format("%.16f\t", centerXYZ().getX()));
        sb.append(String.format("%.16f\t", centerXYZ().getY()));
        sb.append(String.format("%.16f\t", centerXYZ().getZ()));
        sb.append(String.format("%.16f\t", Math.toDegrees(centerXYZ().getDelta())));
        sb.append(String.format("%.16f\t", Math.toDegrees(centerXYZ().getAlpha())));
        sb.append(String.format("%.16f\t", centerXYZ().getNorm()));
        sb.append(String.format("%s\t", slopeColoring()));
        sb.append(String.format("%s\t", elevationColoring()));
        sb.append(String.format("%s\t", accelerationColoring()));
        sb.append(String.format("%s\t", potentialColoring()));
        sb.append(String.format("%f\t", diameter()));
        sb.append(String.format("%f\t", flattening()));
        sb.append(String.format("%f\t", regularAngle()));
        sb.append(String.format("%d,%d,%d\t", rgb().getRed(), rgb().getGreen(), rgb().getBlue()));
        sb.append(label());
        return sb.toString();
    }

    public static SBMTStructure fromString(String line) {
        String[] parts = line.split("\\s+");
        int id = Integer.parseInt(parts[0]);
        String name = parts[1];
        Vector3D centerXYZ =
                new Vector3D(Double.parseDouble(parts[2]), Double.parseDouble(parts[3]), Double.parseDouble(parts[4]));
        String slopeColoring = parts[8];
        String elevationColoring = parts[9];
        String accelerationColoring = parts[10];
        String potentialColoring = parts[11];
        double diameter = Double.parseDouble(parts[12]);
        double flattening = Double.parseDouble(parts[13]);
        double regularAngle = Double.parseDouble(parts[14]);
        String[] colorParts = parts[15].split(",");
        Color rgb = new Color(
                Integer.parseInt(colorParts[0]), Integer.parseInt(colorParts[1]), Integer.parseInt(colorParts[2]));
        String label = parts[16];

        Builder builder = ImmutableSBMTStructure.builder();
        builder.id(id);
        builder.name(name);
        builder.centerXYZ(centerXYZ);
        builder.slopeColoring(slopeColoring);
        builder.elevationColoring(elevationColoring);
        builder.accelerationColoring(accelerationColoring);
        builder.potentialColoring(potentialColoring);
        builder.diameter(diameter);
        builder.flattening(flattening);
        builder.regularAngle(regularAngle);
        builder.rgb(rgb);
        builder.label(label);
        return builder.build();
    }
}
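The fields written by `toString()` and re-read by `fromString()` form a 17-column, whitespace-delimited record. Columns 5-7 (latitude, longitude, radius) are derived from `centerXYZ` on write and ignored on read, which is why `fromString()` jumps from `parts[4]` to `parts[8]`. A standalone sketch of that layout, with made-up sample values (the coloring flags and label here are hypothetical, not values the tool produces):

```java
public class StructureLineSketch {
    public static void main(String[] args) {
        // Hypothetical record in the 17-column layout written by toString():
        // id name x y z lat lon radius slope elev accel potential diameter flattening angle r,g,b label
        String line = "1\tcrater\t0.1\t0.2\t0.3\t53.3\t63.4\t0.374\tnone\tnone\tnone\tnone"
                + "\t0.5\t0.0\t10.0\t255,0,0\tmyLabel";
        String[] parts = line.split("\\s+");
        // lat/lon/radius (parts[5..7]) are redundant with parts[2..4] and are skipped on read
        System.out.println(parts.length); // 17
        System.out.println(parts[15]); // 255,0,0 -> the comma-separated RGB column
    }
}
```

Note that splitting on `\s+` means neither the name nor the label column may contain spaces.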


@@ -32,175 +32,172 @@ import vtk.vtkPolyData;
/**
 * This class is used to subdivide the bounding box of a shape model into a contiguous grid of 3D
 * cubes (sort of like voxels).
 *
 * @author kahneg1
 * @version 1.0
 */
public class SmallBodyCubes {

    private static final Logger logger = LogManager.getLogger(SmallBodyCubes.class);

    private BoundingBox boundingBox;
    private ArrayList<BoundingBox> allCubes = new ArrayList<BoundingBox>();
    private final double cubeSize;
    private final double buffer;

    /**
     * Create a cube set structure for the given model, where each cube has side <tt>cubeSize</tt> and
     * <tt>buffer</tt> is added to all sides of the bounding box of the model. Cubes that do not
     * intersect the asteroid at all are removed.
     *
     * @param smallBodyPolyData
     * @param cubeSize
     * @param buffer
     */
    public SmallBodyCubes(vtkPolyData smallBodyPolyData, double cubeSize, double buffer, boolean removeEmptyCubes) {
        this.cubeSize = cubeSize;
        this.buffer = buffer;

        initialize(smallBodyPolyData);

        if (removeEmptyCubes) removeEmptyCubes(smallBodyPolyData);
    }

    private void initialize(vtkPolyData smallBodyPolyData) {
        smallBodyPolyData.ComputeBounds();
        double[] bounds = smallBodyPolyData.GetBounds();
        boundingBox = new BoundingBox(
                new Interval(bounds[0] - buffer, bounds[1] + buffer),
                new Interval(bounds[2] - buffer, bounds[3] + buffer),
                new Interval(bounds[4] - buffer, bounds[5] + buffer));

        int numCubesX = (int) Math.ceil(boundingBox.getXRange().getLength() / cubeSize);
        int numCubesY = (int) Math.ceil(boundingBox.getYRange().getLength() / cubeSize);
        int numCubesZ = (int) Math.ceil(boundingBox.getZRange().getLength() / cubeSize);

        for (int k = 0; k < numCubesZ; ++k) {
            double zmin = boundingBox.getZRange().getBegin() + k * cubeSize;
            double zmax = boundingBox.getZRange().getBegin() + (k + 1) * cubeSize;
            for (int j = 0; j < numCubesY; ++j) {
                double ymin = boundingBox.getYRange().getBegin() + j * cubeSize;
                double ymax = boundingBox.getYRange().getBegin() + (j + 1) * cubeSize;
                for (int i = 0; i < numCubesX; ++i) {
                    double xmin = boundingBox.getXRange().getBegin() + i * cubeSize;
                    double xmax = boundingBox.getXRange().getBegin() + (i + 1) * cubeSize;
                    BoundingBox bb = new BoundingBox(
                            new Interval(xmin, xmax), new Interval(ymin, ymax), new Interval(zmin, zmax));
                    allCubes.add(bb);
                }
            }
        }
    }

    private void removeEmptyCubes(vtkPolyData smallBodyPolyData) {
        logger.info("total cubes before reduction = {}", allCubes.size());

        // Remove from allCubes all cubes that do not intersect the asteroid
        TreeSet<Integer> intersectingCubes = getIntersectingCubes(smallBodyPolyData);

        ArrayList<BoundingBox> tmpCubes = new ArrayList<BoundingBox>();
        for (Integer i : intersectingCubes) {
            tmpCubes.add(allCubes.get(i));
        }
        allCubes = tmpCubes;

        logger.info("finished initializing cubes, total = {}", allCubes.size());
    }

    public BoundingBox getCube(int cubeId) {
        return allCubes.get(cubeId);
    }

    /**
     * Get all the cubes that intersect with <tt>polydata</tt>
     *
     * @param polydata
     * @return
     */
    public TreeSet<Integer> getIntersectingCubes(vtkPolyData polydata) {
        TreeSet<Integer> cubeIds = new TreeSet<Integer>();

        // Iterate through each cube and check if it intersects
        // with the bounding box of any of the polygons of the polydata
        BoundingBox polydataBB = new BoundingBox(polydata.GetBounds());
        long numberPolygons = polydata.GetNumberOfCells();

        // Store all the bounding boxes of the individual polygons in an array
        // first, since the call to GetCellBounds is very slow.
        double[] cellBounds = new double[6];
        ArrayList<BoundingBox> polyCellsBB = new ArrayList<BoundingBox>();
        for (int j = 0; j < numberPolygons; ++j) {
            polydata.GetCellBounds(j, cellBounds);
            polyCellsBB.add(new BoundingBox(cellBounds));
        }

        int numberCubes = allCubes.size();
        for (int i = 0; i < numberCubes; ++i) {
            // Before checking each polygon individually, first see if the
            // polydata as a whole intersects the cube
            BoundingBox cube = getCube(i);
            if (cube.intersects(polydataBB)) {
                for (int j = 0; j < numberPolygons; ++j) {
                    BoundingBox bb = polyCellsBB.get(j);
                    if (cube.intersects(bb)) {
                        cubeIds.add(i);
                        break;
                    }
                }
            }
        }

        return cubeIds;
    }

    /**
     * Get all the cubes that intersect with BoundingBox <tt>bb</tt>
     *
     * @param bb
     * @return
     */
    public TreeSet<Integer> getIntersectingCubes(BoundingBox bb) {
        TreeSet<Integer> cubeIds = new TreeSet<Integer>();

        int numberCubes = allCubes.size();
        for (int i = 0; i < numberCubes; ++i) {
            BoundingBox cube = getCube(i);
            if (cube.intersects(bb)) {
                cubeIds.add(i);
            }
        }
        return cubeIds;
    }

    /**
     * Get the id of the cube containing <tt>point</tt>
     *
     * @param point
     * @return
     */
    public int getCubeId(double[] point) {
        if (!boundingBox.contains(point)) return -1;

        int numberCubes = allCubes.size();
        for (int i = 0; i < numberCubes; ++i) {
            BoundingBox cube = getCube(i);
            if (cube.contains(point)) return i;
        }

        // If we reach here something is wrong
        System.err.println("Error: could not find cube");
        return -1;
    }
}
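`getCubeId` above scans every cube, but because `initialize` fills `allCubes` with x varying fastest, then y, then z, the id of the cube containing a point can also be computed arithmetically in O(1). A minimal sketch of that index math (the grid parameters and class name here are made up for illustration, not part of Terrasaur):

```java
public class CubeIndexSketch {
    // Hypothetical grid, mirroring SmallBodyCubes: bounding-box minimum corner,
    // cube side length, and cube counts along each axis
    static final double XMIN = -1.0, YMIN = -1.0, ZMIN = -1.0;
    static final double CUBE_SIZE = 0.5;
    static final int NX = 4, NY = 4, NZ = 4;

    // Cubes were appended x-fastest, then y, then z, so the flat index is
    // i + NX * (j + NY * k)
    static int cubeId(double[] p) {
        int i = (int) ((p[0] - XMIN) / CUBE_SIZE);
        int j = (int) ((p[1] - YMIN) / CUBE_SIZE);
        int k = (int) ((p[2] - ZMIN) / CUBE_SIZE);
        if (i < 0 || i >= NX || j < 0 || j >= NY || k < 0 || k >= NZ) return -1;
        return i + NX * (j + NY * k);
    }

    public static void main(String[] args) {
        System.out.println(cubeId(new double[] {-0.9, -0.9, -0.9})); // 0 (first cube)
        System.out.println(cubeId(new double[] {0.9, 0.9, 0.9})); // 63 (last cube)
        System.out.println(cubeId(new double[] {2.0, 0.0, 0.0})); // -1 (outside the box)
    }
}
```

The linear scan in the class above is still correct; this is only a note that the regular grid admits a direct lookup if `getCubeId` ever becomes a hotspot.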

File diff suppressed because it is too large


@@ -36,51 +36,49 @@ import org.apache.logging.log4j.Logger;
*/
public class DefaultTerrasaurTool implements TerrasaurTool {
    private static final Logger logger = LogManager.getLogger();

    /**
     * This doesn't need to be private, or even declared, but you might want to if you have other
     * constructors.
     */
    private DefaultTerrasaurTool() {}

    @Override
    public String shortDescription() {
        return "SHORT DESCRIPTION.";
    }

    @Override
    public String fullDescription(Options options) {
        String header = "TEXT APPEARING BEFORE COMMAND LINE OPTION SUMMARY";
        String footer = "\nTEXT APPENDED TO COMMAND LINE OPTION SUMMARY.\n";
        return TerrasaurTool.super.fullDescription(options, header, footer);
    }

    private static Options defineOptions() {
        Options options = TerrasaurTool.defineOptions();
        options.addOption(Option.builder("env")
                .hasArgs()
                .required()
                .desc("Print the named environment variable's value. Can take multiple arguments.")
                .build());
        return options;
    }

    public static void main(String[] args) {
        TerrasaurTool defaultOBJ = new DefaultTerrasaurTool();

        Options options = defineOptions();

        CommandLine cl = defaultOBJ.parseArgs(args, options);

        Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
        for (MessageLabel ml : startupMessages.keySet())
            logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));

        for (String env : cl.getOptionValues("env")) logger.info(String.format("%s: %s", env, System.getenv(env)));

        logger.info("Finished");
    }
}


@@ -22,17 +22,16 @@
*/
package terrasaur.templates;
import java.io.PrintWriter;
import java.io.StringWriter;
import java.net.InetAddress;
import java.net.UnknownHostException;
import java.util.LinkedHashMap;
import java.util.Map;
import org.apache.commons.cli.*;
import org.apache.logging.log4j.Level;
import terrasaur.utils.AppVersion;
import terrasaur.utils.Log4j2Configurator;
/**
* All classes in the apps folder should implement this interface. Calling the class without
@@ -43,188 +42,181 @@ import java.util.Map;
*/
public interface TerrasaurTool {
    /** Show required options first, followed by non-required. */
    class CustomHelpFormatter extends HelpFormatter {
        public CustomHelpFormatter() {
            setOptionComparator((o1, o2) -> {
                if (o1.isRequired() && !o2.isRequired()) return -1;
                if (!o1.isRequired() && o2.isRequired()) return 1;
                return o1.getKey().compareToIgnoreCase(o2.getKey());
            });
        }
    }

    /**
     * @return One line description of this tool
     */
    String shortDescription();

    /**
     * @param options command line options
     * @param header String to print before argument list
     * @param footer String to print after argument list
     * @return Complete description of this tool.
     */
    default String fullDescription(Options options, String header, String footer) {
        StringWriter sw = new StringWriter();
        PrintWriter pw = new PrintWriter(sw);
        pw.println(AppVersion.getFullString() + "\n");
        HelpFormatter formatter = new CustomHelpFormatter();
        formatter.printHelp(
                pw,
                formatter.getWidth(),
                String.format("%s [options]", this.getClass().getSimpleName()),
                header,
                options,
                formatter.getLeftPadding(),
                formatter.getDescPadding(),
                footer);
        pw.flush();
        return sw.toString();
    }

    /**
     * @param options command line options
     * @return Complete description of this tool.
     */
    default String fullDescription(Options options) {
        return fullDescription(options, "", "");
    }

    /**
     * @param args arguments to parse
     * @param options set of options accepted by the program
     * @return command line formed by parsing arguments
     */
    default CommandLine parseArgs(String[] args, Options options) {
        // if no arguments, print the usage and exit
        if (args.length == 0) {
            System.out.println(fullDescription(options));
            System.exit(0);
        }

        // if -shortDescription is specified, print short description and exit.
        for (String arg : args) {
            if (arg.equals("-shortDescription")) {
                System.out.println(shortDescription());
                System.exit(0);
            }
        }

        // parse the arguments
        CommandLine cl = null;
        try {
            cl = new DefaultParser().parse(options, args);
        } catch (ParseException e) {
            System.out.println(e.getMessage());
            System.out.println(fullDescription(options));
            System.exit(0);
        }
        return cl;
    }

    /**
     * @return options including -logFile and -logLevel
     */
    static Options defineOptions() {
        Options options = new Options();
        options.addOption(Option.builder("logFile")
                .hasArg()
                .desc("If present, save screen output to log file.")
                .build());
        StringBuilder sb = new StringBuilder();
        for (Level l : Level.values()) sb.append(String.format("%s ", l.name()));
        options.addOption(Option.builder("logLevel")
                .hasArg()
                .desc("If present, print messages above selected priority. Valid values are "
                        + sb.toString().trim()
                        + ". Default is INFO.")
                .build());
        return options;
    }

    /** Labels for startup messages. The enum order is the order in which they are printed. */
    enum MessageLabel {
        START("Start"),
        ARGUMENTS("arguments");

        public final String label;

        MessageLabel(String label) {
            this.label = label;
        }
    }

    /**
     * Generate startup messages. This is returned as a map. For example:
     *
     * <table>
     * <tr>
     * <th>Key</th>
     * <th>Value</th>
     * </tr>
     * <tr>
     * <td>
     * Start
     * </td>
     * <td>
     * MEGANESimulator [MMXTools version 25.01.28-b868ef6M] on nairah1-ml1
     * </td>
     * </tr>
     * <tr>
     * <td>arguments:</td>
     * <td>-spice /project/sis/users/nairah1/MMX/spice/meganeLCp.mk -startTime 2026 JUN 20 00:00:00 -stopTime 2026 JUN 20 08:00:00 -delta 180 -outputCSV tmp.csv -numThreads 1 -obj /project/sis/users/nairah1/MMX/obj/Phobos-Ernst-800.obj -dbName tmp.db</td>
     * </tr>
     * </table>
     *
     * @param cl Command line object
     * @return standard startup messages
     */
    default Map<MessageLabel, String> startupMessages(CommandLine cl) {
        Map<MessageLabel, String> startupMessages = new LinkedHashMap<>();

        String hostname = "unknown host";
        try {
            hostname = InetAddress.getLocalHost().getHostName();
        } catch (UnknownHostException ignored) {
        }

        Log4j2Configurator lc = Log4j2Configurator.getInstance();
        if (cl.hasOption("logLevel"))
            lc.setLevel(
                    Level.valueOf(cl.getOptionValue("logLevel").toUpperCase().trim()));
        if (cl.hasOption("logFile")) lc.addFile(cl.getOptionValue("logFile"));

        StringBuilder sb = new StringBuilder(
                String.format("%s [%s] on %s", getClass().getSimpleName(), AppVersion.getVersionString(), hostname));
        startupMessages.put(MessageLabel.START, sb.toString());

        sb = new StringBuilder();
        for (Option option : cl.getOptions()) {
            sb.append("-").append(option.getOpt()).append(" ");
            if (option.hasArgs()) {
                for (String arg : option.getValues()) sb.append(arg).append(" ");
            } else if (option.hasArg()) {
                sb.append(option.getValue()).append(" ");
            }
        }
        for (String arg : cl.getArgs()) {
            sb.append(arg).append(" ");
        }
        startupMessages.put(MessageLabel.ARGUMENTS, sb.toString());

        return startupMessages;
    }
}
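The effect of `CustomHelpFormatter`'s comparator can be checked in isolation. This sketch models options as a simple record rather than a commons-cli `Option` (the class and record names are made up for illustration):

```java
import java.util.ArrayList;
import java.util.List;

public class OptionOrderSketch {
    // Stand-in for org.apache.commons.cli.Option: just a key and a required flag
    record Opt(String key, boolean required) {}

    // Mirrors CustomHelpFormatter: required options sort first,
    // then case-insensitive by key
    static List<String> sortKeys(List<Opt> opts) {
        List<Opt> sorted = new ArrayList<>(opts);
        sorted.sort((o1, o2) -> {
            if (o1.required() && !o2.required()) return -1;
            if (!o1.required() && o2.required()) return 1;
            return o1.key().compareToIgnoreCase(o2.key());
        });
        return sorted.stream().map(Opt::key).toList();
    }

    public static void main(String[] args) {
        // With DefaultTerrasaurTool's options, the required -env sorts ahead of
        // the optional -logFile and -logLevel
        System.out.println(sortKeys(List.of(
                new Opt("logLevel", false), new Opt("env", true), new Opt("logFile", false))));
        // [env, logFile, logLevel]
    }
}
```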


@@ -24,26 +24,25 @@
package terrasaur.utils;
public class AppVersion {
    public static final String lastCommit = "25.07.30";

    // an M at the end of gitRevision means this was built from a "dirty" git repository
    public static final String gitRevision = "6212144";

    public static final String applicationName = "Terrasaur";
    public static final String dateString = "2025-Jul-30 16:05:45 UTC";

    private AppVersion() {}

    /**
     * Terrasaur version 25.07.30-6212144 built 2025-Jul-30 16:05:45 UTC
     */
    public static String getFullString() {
        return String.format("%s version %s-%s built %s", applicationName, lastCommit, gitRevision, dateString);
    }

    /**
     * Terrasaur version 25.07.30-6212144
     */
    public static String getVersionString() {
        return String.format("%s version %s-%s", applicationName, lastCommit, gitRevision);
    }
}


@@ -29,76 +29,78 @@ package terrasaur.utils;
 * From <a href=
 * "https://stackoverflow.com/questions/6162651/half-precision-floating-point-in-java/6162687">Stack
 * Overflow</a>
 *
 * @author nairah1
 */
public class Binary16 {

    /**
     * Calculate a floating point value from the lower 16 bits of a 32 bit integer. The upper 16 bits
     * are ignored.
     *
     * @param hbits
     * @return
     */
    // ignores the higher 16 bits
    public static float toFloat(int hbits) {
        int mant = hbits & 0x03ff; // 10 bits mantissa
        int exp = hbits & 0x7c00; // 5 bits exponent
        if (exp == 0x7c00) // NaN/Inf
            exp = 0x3fc00; // -> NaN/Inf
        else if (exp != 0) // normalized value
        {
            exp += 0x1c000; // exp - 15 + 127
            if (mant == 0 && exp > 0x1c400) // smooth transition
                return Float.intBitsToFloat((hbits & 0x8000) << 16 | exp << 13 | 0x3ff);
        } else if (mant != 0) // && exp==0 -> subnormal
        {
            exp = 0x1c400; // make it normal
            do {
                mant <<= 1; // mantissa * 2
                exp -= 0x400; // decrease exp by 1
            } while ((mant & 0x400) == 0); // while not normal
            mant &= 0x3ff; // discard subnormal bit
        } // else +/-0 -> +/-0
        return Float.intBitsToFloat( // combine all parts
                (hbits & 0x8000) << 16 // sign << ( 31 - 15 )
                        | (exp | mant) << 13); // value << ( 23 - 10 )
    }

    /**
     * Calculate a 16 bit representation of a floating point value. The upper 16 bits of the result
     * are 0.
     *
     * @param fval
     * @return
     */
    // returns all higher 16 bits as 0 for all results
    public static int fromFloat(float fval) {
        int fbits = Float.floatToIntBits(fval);
        int sign = fbits >>> 16 & 0x8000; // sign only
        int val = (fbits & 0x7fffffff) + 0x1000; // rounded value
        if (val >= 0x47800000) // might be or become NaN/Inf
        { // avoid Inf due to rounding
            if ((fbits & 0x7fffffff) >= 0x47800000) { // is or must become NaN/Inf
                if (val < 0x7f800000) // was value but too large
                    return sign | 0x7c00; // make it +/-Inf
                return sign
                        | 0x7c00 // remains +/-Inf or NaN
                        | (fbits & 0x007fffff) >>> 13; // keep NaN (and Inf) bits
            }
            return sign | 0x7bff; // unrounded not quite Inf
        }
        if (val >= 0x38800000) // remains normalized value
            return sign | val - 0x38000000 >>> 13; // exp - 127 + 15
        if (val < 0x33000000) // too small for subnormal
            return sign; // becomes +/-0
        val = (fbits & 0x7fffffff) >>> 23; // tmp exp for subnormal calc
        return sign
                | ((fbits & 0x7fffff | 0x800000) // add subnormal bit
                                + (0x800000 >>> val - 102) // round depending on cut off
                        >>> 126 - val); // div by 2^(1-(exp-127+15)) and >> 13 | exp=0
    }
}
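Since Terrasaur requires Java 21, the JDK's own half-precision conversions (`Float.floatToFloat16` and `Float.float16ToFloat`, added in Java 20) perform the same binary16 encoding and can serve as a cross-check. A quick sketch against two well-known encodings (the class name here is hypothetical):

```java
public class HalfFloatCheck {
    public static void main(String[] args) {
        // 1.0f in binary16: sign 0, biased exponent 15 (01111), mantissa 0 -> 0x3C00
        short one = Float.floatToFloat16(1.0f);
        System.out.println(Integer.toHexString(one & 0xFFFF)); // 3c00

        // -2.0f: sign 1, biased exponent 16 (10000), mantissa 0 -> 0xC000
        short minusTwo = Float.floatToFloat16(-2.0f);
        System.out.println(Integer.toHexString(minusTwo & 0xFFFF)); // c000

        // The round trip is exact for any value representable in binary16
        System.out.println(Float.float16ToFloat(one)); // 1.0
    }
}
```

Note the built-ins use `short` where `Binary16` uses the low 16 bits of an `int`, so comparing the two requires masking with `& 0xFFFF`.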

Some files were not shown because too many files have changed in this diff