Merge pull request #6 from JHUAPL/5-update-to-v110

merge 5 update to v110
Hari Nair
2025-07-30 12:13:27 -04:00
committed by GitHub
226 changed files with 36729 additions and 36147 deletions


@@ -1,5 +1,20 @@
# Terrasaur Changelog
## July 30, 2025 - v1.1.0
- Updates to existing tools
  - AdjustShapeModelToOtherShapeModel
    - fix intersection bug
  - CreateSBMTStructure
    - new options: -flipX, -flipY, -spice, -date, -observer, -target, -cameraFrame
  - ValidateNormals
    - new option: -fast, which only checks for overhangs when the facet center and normal point in opposite directions
- New tools:
  - FacetInfo: Print info about a facet
  - PointCloudOverlap: Find points in a point cloud that overlap a reference point cloud
  - TriAx: Generate a triaxial ellipsoid in ICQ format
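For example, the new TriAx tool can be run as shown on its documentation page added in this release (doc/tools/TriAx.rst); the values here are just the documented example, not required defaults:

```
TriAx -A 10 -B 8 -C 6 -Q 8 -output triax.icq -saveOBJ
```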
## April 28, 2025 - v1.0.1
- Add MIT license to repository and source code



@@ -16,7 +16,7 @@ The Terrasaur package requires Java 21 or later. Some freely available versions
Download
~~~~~~~~
Binary packages for use on Mac OS X and Linux are available at `GitHub <https://github.com/JHUAPL/Terrasaur/releases>`__.
We have not tried using the software on Microsoft Windows, but users may try the Linux package with the `Windows Subsystem for Linux <https://docs.microsoft.com/en-us/windows/wsl/>`__.

doc/tools/ColorSpots.rst Normal file

@@ -0,0 +1,70 @@
.. _ColorSpots:
##########
ColorSpots
##########
ColorSpots takes as input a shape model and a file containing (x, y, z, value)
or (lat, lon, value) records. For each facet, it writes out the mean and standard
deviation of the values within a specified search radius.
.. include:: ../toolDescriptions/ColorSpots.txt
:literal:
********
Examples
********
Download the :download:`Apophis<./support_files/apophis_g_15618mm_rdr_obj_0000n00000_v001.obj>`
shape model and the :download:`info<./support_files/xyzrandom.txt>` file containing
Cartesian coordinates and a random value at each point.
Run ColorSpots:
::
ColorSpots -obj apophis_g_15618mm_rdr_obj_0000n00000_v001.obj -xyz \
-info xyzrandom.txt -outFile apophis_value_at_vertex.csv -noWeight \
-allFacets -additionalFields n -searchRadius 0.015 -writeVertices
The first few lines of apophis_value_at_vertex.csv look like:
::
% head apophis_value_at_vertex.csv
0.000000e+00, 0.000000e+00, 1.664960e-01, -3.805764e-02, 5.342315e-01, 4.000000e+01
1.589500e-02, 0.000000e+00, 1.591030e-01, 6.122849e-02, 6.017192e-01, 5.000000e+01
7.837000e-03, 1.486800e-02, 1.591670e-01, -6.072964e-03, 5.220682e-01, 5.700000e+01
-7.747000e-03, 1.506300e-02, 1.621040e-01, 9.146163e-02, 5.488631e-01, 4.900000e+01
-1.554900e-02, 0.000000e+00, 1.657970e-01, -8.172811e-03, 5.270302e-01, 3.400000e+01
-7.982000e-03, -1.571100e-02, 1.694510e-01, -2.840524e-02, 5.045911e-01, 3.900000e+01
8.060000e-03, -1.543300e-02, 1.655150e-01, 3.531959e-02, 5.464390e-01, 4.900000e+01
3.179500e-02, 0.000000e+00, 1.515820e-01, -1.472434e-02, 5.967265e-01, 5.400000e+01
2.719700e-02, 1.658200e-02, 1.508930e-01, -9.050683e-03, 5.186966e-01, 4.700000e+01
1.554100e-02, 2.901300e-02, 1.530770e-01, -7.053547e-02, 4.980369e-01, 7.000000e+01
The columns are:
.. list-table:: ColorSpots Vertex output
:header-rows: 1
* - Column
- Value
* - 1
- X
* - 2
- Y
* - 3
- Z
* - 4
- mean value in region
* - 5
- standard deviation in region
* - 6
- number of points in region
.. figure:: images/ColorSpots-n.png
:alt: Number of points in region at each vertex
This image shows the number of points in the region at each vertex.
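As a quick, informal check (not part of the Terrasaur tools), the per-vertex CSV can be
summarized with standard command-line utilities; for example, averaging the point count in
column 6:
::
awk -F',' '{ n += $6; rows++ } END { printf "mean points per region: %.2f\n", n/rows }' apophis_value_at_vertex.csv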


@@ -23,7 +23,7 @@ Local Model Comparison
Download the :download:`reference<./support_files/EVAL20_wtr.obj>` and :download:`comparison<./support_files/EVAL20.obj>`
shape models. You can view them in a tool such as
`ParaView <https://www.paraview.org/>`__.
.. figure:: images/CompareOBJ_local_1.png
@@ -32,8 +32,8 @@ shape models. You can view them in a tool such as
Run CompareOBJ to find the optimal transform to align the comparison with the reference:
::
CompareOBJ -computeOptimalRotationAndTranslation -model EVAL20.obj \
-reference EVAL20_wtr.obj -computeVerticalError verticalError.txt \
-saveOptimalShape optimal.obj -savePlateDiff plateDiff.txt -savePlateIndex plateIndex.txt
The screen output is
@@ -77,7 +77,7 @@ model for comparison:
::
ShapeFormatConverter -input Bennu49k.obj -output BennuComparison.obj \
-rotate 5,0,0,1 -translate 0.01,-0.01,0.01
This rotates the shape model by 5 degrees about the z axis and then translates
@@ -94,11 +94,11 @@ Run CompareOBJ to find the optimal transform to align the comparison with the re
CompareOBJ -computeOptimalRotationAndTranslation \
-model BennuComparison.obj \
-reference Bennu49k.obj \
-computeVerticalError terrasaur-verticalError.txt \
-saveOptimalShape terrasaur-optimal.obj \
-savePlateDiff terrasaur-plateDiff.txt \
-savePlateIndex terrasaur-plateIndex.txt
The screen output is


@@ -0,0 +1,38 @@
.. _PointCloudOverlap:
#################
PointCloudOverlap
#################
*****
Usage
*****
.. include:: ../toolDescriptions/PointCloudOverlap.txt
:literal:
********
Examples
********
Download the :download:`reference<./support_files/EVAL20_wtr.obj>` and :download:`comparison<./support_files/EVAL20.obj>`
shape models. You can view them in a tool such as
`ParaView <https://www.paraview.org/>`__.
.. figure:: images/PointCloudOverlap_1.png
This image shows the reference (pink) and input (blue) shape models.
Run PointCloudOverlap:
::
PointCloudOverlap -inputFile EVAL20.obj -referenceFile EVAL20_wtr.obj -outputFile overlap.vtk
Note that OBJ is supported as an input format but not as an output format.
.. figure:: images/PointCloudOverlap_2.png
The points in white are those in the input model that overlap the reference.
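If ParaView (mentioned above) is installed and on the PATH, one informal way to inspect the
VTK output is simply:
::
paraview overlap.vtk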

doc/tools/TriAx.rst Normal file

@@ -0,0 +1,43 @@
.. _TriAx:
=====
TriAx
=====
TriAx is an implementation of the SPC tool TRIAX, which generates a triaxial ellipsoid in ICQ format.
*****
Usage
*****
.. include:: ../toolDescriptions/TriAx.txt
:literal:
*******
Example
*******
Generate an ellipsoid with dimensions 10, 8, and 6, using q = 8.
::
TriAx -A 10 -B 8 -C 6 -Q 8 -output triax.icq -saveOBJ
The following ellipsoid is generated:
.. container:: figures-row
.. figure:: images/TriAx_X.png
:alt: looking down from the +X direction
looking down from the +X direction
.. figure:: images/TriAx_Y.png
:alt: looking down from the +Y direction
looking down from the +Y direction
.. figure:: images/TriAx_Z.png
:alt: looking down from the +Z direction
looking down from the +Z direction

(Five new binary image files, 23 to 174 KiB, are added in this commit; previews not shown.)


@@ -89,7 +89,7 @@ function build_jar() {
function make_scripts() { function make_scripts() {
classes=$(jar tf "${scriptPath}"/target/${packageName}.jar | grep $appSrcDir | grep -v '\$' | grep -v "package-info" | grep class) classes=$(jar tf ${scriptPath}/target/${packageName}.jar | grep $appSrcDir | grep -v '\$' | grep -v "package-info" | grep -v "Immutable" | grep class)
for class in $classes; do for class in $classes; do
base=$(basename "$class" ".class") base=$(basename "$class" ".class")

pom.xml

@@ -1,5 +1,6 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" <project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd"> xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion> <modelVersion>4.0.0</modelVersion>
<groupId>edu.jhuapl.ses.srn</groupId> <groupId>edu.jhuapl.ses.srn</groupId>
<artifactId>Terrasaur</artifactId> <artifactId>Terrasaur</artifactId>
@@ -42,7 +43,7 @@
<maven.compiler.release>21</maven.compiler.release> <maven.compiler.release>21</maven.compiler.release>
<javafx.version>21.0.5</javafx.version> <javafx.version>21.0.5</javafx.version>
<immutables.version>2.10.1</immutables.version> <immutables.version>2.11.1</immutables.version>
<jackfruit.version>1.1.1</jackfruit.version> <jackfruit.version>1.1.1</jackfruit.version>
</properties> </properties>
@@ -62,6 +63,10 @@
<includes> <includes>
<include>**/*.java</include> <include>**/*.java</include>
</includes> </includes>
<excludes>
<exclude>3rd-party/**/*.java</exclude>
<exclude>support-libraries/3rd-party/**/*.java</exclude>
</excludes>
</licenseSet> </licenseSet>
</licenseSets> </licenseSets>
</configuration> </configuration>
@@ -126,14 +131,15 @@
<artifactId>maven-surefire-plugin</artifactId> <artifactId>maven-surefire-plugin</artifactId>
<version>3.5.3</version> <version>3.5.3</version>
<configuration> <configuration>
<argLine> -Djava.library.path=${project.basedir}/3rd-party/${env.ARCH}/spice/JNISpice/lib <argLine>
-Djava.library.path=${project.basedir}/3rd-party/${env.ARCH}/spice/JNISpice/lib
</argLine> </argLine>
</configuration> </configuration>
</plugin> </plugin>
<plugin> <plugin>
<groupId>org.apache.maven.plugins</groupId> <groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-enforcer-plugin</artifactId> <artifactId>maven-enforcer-plugin</artifactId>
<version>3.5.0</version> <version>3.6.1</version>
<executions> <executions>
<execution> <execution>
<id>enforce-maven</id> <id>enforce-maven</id>
@@ -207,7 +213,7 @@
<plugin> <plugin>
<groupId>org.codehaus.mojo</groupId> <groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId> <artifactId>exec-maven-plugin</artifactId>
<version>3.5.0</version> <version>3.5.1</version>
<executions> <executions>
<execution> <execution>
<phase>generate-sources</phase> <phase>generate-sources</phase>
@@ -222,6 +228,18 @@
</execution> </execution>
</executions> </executions>
</plugin> </plugin>
<plugin>
<groupId>com.diffplug.spotless</groupId>
<artifactId>spotless-maven-plugin</artifactId>
<version>2.46.1</version>
<configuration>
<java>
<palantirJavaFormat />
</java>
</configuration>
</plugin>
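For reference (these are standard Spotless Maven goals, not part of this diff), the plugin added above is typically invoked with:
mvn spotless:check   # report any Java sources that do not match the palantir format
mvn spotless:apply   # rewrite sources in place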
</plugins> </plugins>
</build> </build>
<dependencies> <dependencies>
@@ -264,7 +282,7 @@
<dependency> <dependency>
<groupId>commons-beanutils</groupId> <groupId>commons-beanutils</groupId>
<artifactId>commons-beanutils</artifactId> <artifactId>commons-beanutils</artifactId>
<version>1.10.0</version> <version>1.11.0</version>
</dependency> </dependency>
<dependency> <dependency>
<groupId>commons-cli</groupId> <groupId>commons-cli</groupId>
@@ -274,17 +292,22 @@
<dependency> <dependency>
<groupId>org.apache.commons</groupId> <groupId>org.apache.commons</groupId>
<artifactId>commons-configuration2</artifactId> <artifactId>commons-configuration2</artifactId>
<version>2.11.0</version> <version>2.12.0</version>
</dependency> </dependency>
<dependency> <dependency>
<groupId>org.apache.commons</groupId> <groupId>org.apache.commons</groupId>
<artifactId>commons-csv</artifactId> <artifactId>commons-csv</artifactId>
<version>1.13.0</version> <version>1.14.1</version>
</dependency> </dependency>
<dependency> <dependency>
<groupId>commons-io</groupId> <groupId>commons-io</groupId>
<artifactId>commons-io</artifactId> <artifactId>commons-io</artifactId>
<version>2.18.0</version> <version>2.20.0</version>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-text</artifactId>
<version>1.14.0</version>
</dependency> </dependency>
<dependency> <dependency>
<groupId>com.beust</groupId> <groupId>com.beust</groupId>
@@ -294,7 +317,7 @@
<dependency> <dependency>
<groupId>com.google.code.gson</groupId> <groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId> <artifactId>gson</artifactId>
<version>2.12.1</version> <version>2.13.1</version>
</dependency> </dependency>
<dependency> <dependency>
@@ -315,18 +338,13 @@
<dependency> <dependency>
<groupId>gov.nasa.gsfc.heasarc</groupId> <groupId>gov.nasa.gsfc.heasarc</groupId>
<artifactId>nom-tam-fits</artifactId> <artifactId>nom-tam-fits</artifactId>
<version>1.20.2</version> <version>1.21.1</version>
</dependency> </dependency>
<dependency> <dependency>
<groupId>gov.nasa.jpl.naif</groupId> <groupId>gov.nasa.jpl.naif</groupId>
<artifactId>spice</artifactId> <artifactId>spice</artifactId>
<version>N0067</version> <version>N0067</version>
</dependency> </dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-text</artifactId>
<version>1.13.0</version>
</dependency>
<dependency> <dependency>
<groupId>org.apache.logging.log4j</groupId> <groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-api</artifactId> <artifactId>log4j-api</artifactId>


@@ -47,8 +47,7 @@ public class ALTWGProductNamer implements ProductNamer {
String[] fields = productName.split("_"); String[] fields = productName.split("_");
String returnField = "ERROR"; String returnField = "ERROR";
if (fieldNum > fields.length) { if (fieldNum > fields.length) {
System.out.println( System.out.println("ERROR, field:" + fieldNum + " requested is beyond the number of fields found.");
"ERROR, field:" + fieldNum + " requested is beyond the number of fields found.");
System.out.println("returning:" + returnField); System.out.println("returning:" + returnField);
} else { } else {
returnField = fields[fieldNum]; returnField = fields[fieldNum];
@@ -57,8 +56,7 @@ public class ALTWGProductNamer implements ProductNamer {
} }
@Override @Override
public String productbaseName( public String productbaseName(FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal) {
FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal) {
String gsd = "gsd"; String gsd = "gsd";
String dataSrc = "dataSrc"; String dataSrc = "dataSrc";
@@ -126,8 +124,7 @@ public class ALTWGProductNamer implements ProductNamer {
String clahLon = ALTWGProductNamer.clahLon(cLat, cLon); String clahLon = ALTWGProductNamer.clahLon(cLat, cLon);
// pds likes having filenames all in the same case, so chose lowercase // pds likes having filenames all in the same case, so chose lowercase
String outFile = String outFile = ALTWGProductNamer.altwgBaseName(region, gsd, dataSrc, productType, clahLon, prodVer);
ALTWGProductNamer.altwgBaseName(region, gsd, dataSrc, productType, clahLon, prodVer);
return outFile; return outFile;
} }
@@ -167,8 +164,7 @@ public class ALTWGProductNamer implements ProductNamer {
if (dataSource.length() > 3) { if (dataSource.length() > 3) {
System.out.println("WARNING! dataSource:" + dataSource + " longer than 3 chars!"); System.out.println("WARNING! dataSource:" + dataSource + " longer than 3 chars!");
dataSource = dataSource.substring(0, 3); dataSource = dataSource.substring(0, 3);
System.out.println( System.out.println("Will set data source to:"
"Will set data source to:"
+ dataSource + dataSource
+ " but" + " but"
+ " this might NOT conform to the ALTWG naming convention!"); + " this might NOT conform to the ALTWG naming convention!");
@@ -314,8 +310,7 @@ public class ALTWGProductNamer implements ProductNamer {
} }
if (Double.isNaN(gsdD)) { if (Double.isNaN(gsdD)) {
// still cannot parse gsd. Set to -999 // still cannot parse gsd. Set to -999
System.out.println( System.out.println("WARNING: No valid values of GSD or GSDI could be parsed from fits header!");
"WARNING: No valid values of GSD or GSDI could be parsed from fits header!");
System.out.println("Setting gsd = -999"); System.out.println("Setting gsd = -999");
gsdD = -999D; gsdD = -999D;
} }


@@ -50,8 +50,7 @@ public class AltwgMLNNamer implements ProductNamer {
* @param isGlobal - N/A. Included here as part of the interface structure. * @param isGlobal - N/A. Included here as part of the interface structure.
*/ */
@Override @Override
public String productbaseName( public String productbaseName(FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal) {
FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal) {
// initialize string fragments for NFT name. This will help identify // initialize string fragments for NFT name. This will help identify
// which string fragments have not been updated by the method. // which string fragments have not been updated by the method.
@@ -187,8 +186,7 @@ public class AltwgMLNNamer implements ProductNamer {
System.out.println("file units:" + fileUnits); System.out.println("file units:" + fileUnits);
} else { } else {
String errMesg = String errMesg = "ERROR! Could not find keyword:" + HeaderTag.GSD.toString() + " in hdrBuilder";
"ERROR! Could not find keyword:" + HeaderTag.GSD.toString() + " in hdrBuilder";
throw new RuntimeException(errMesg); throw new RuntimeException(errMesg);
} }


@@ -54,8 +54,7 @@ public class DartNamer implements ProductNamer {
if (dataSource.length() > 3) { if (dataSource.length() > 3) {
System.out.println("WARNING! dataSource:" + dataSource + " longer than 3 chars!"); System.out.println("WARNING! dataSource:" + dataSource + " longer than 3 chars!");
dataSource = dataSource.substring(0, 3); dataSource = dataSource.substring(0, 3);
System.out.println( System.out.println("Will set data source to:"
"Will set data source to:"
+ dataSource + dataSource
+ " but" + " but"
+ " this might NOT conform to the ALTWG naming convention!"); + " this might NOT conform to the ALTWG naming convention!");
@@ -96,8 +95,7 @@ public class DartNamer implements ProductNamer {
String[] fields = productName.split("_"); String[] fields = productName.split("_");
String returnField = "ERROR"; String returnField = "ERROR";
if (fieldNum > fields.length) { if (fieldNum > fields.length) {
System.out.println( System.out.println("ERROR, field:" + fieldNum + " requested is beyond the number of fields found.");
"ERROR, field:" + fieldNum + " requested is beyond the number of fields found.");
System.out.println("returning:" + returnField); System.out.println("returning:" + returnField);
} else { } else {
returnField = fields[fieldNum]; returnField = fields[fieldNum];
@@ -106,8 +104,7 @@ public class DartNamer implements ProductNamer {
} }
@Override @Override
public String productbaseName( public String productbaseName(FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal) {
FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal) {
String gsd = "gsd"; String gsd = "gsd";
String dataSrc = "dataSrc"; String dataSrc = "dataSrc";
@@ -249,8 +246,7 @@ public class DartNamer implements ProductNamer {
} }
if (Double.isNaN(gsdD)) { if (Double.isNaN(gsdD)) {
// still cannot parse gsd. Set to -999 // still cannot parse gsd. Set to -999
System.out.println( System.out.println("WARNING: No valid values of GSD or GSDI could be parsed from fits header!");
"WARNING: No valid values of GSD or GSDI could be parsed from fits header!");
System.out.println("Setting gsd = -999"); System.out.println("Setting gsd = -999");
gsdD = -999D; gsdD = -999D;
} }


@@ -29,8 +29,11 @@ package terrasaur.altwg.pipeline;
* *
*/ */
public enum NameConvention { public enum NameConvention {
ALTPRODUCT,
ALTPRODUCT, ALTNFTMLN, DARTPRODUCT, NOMATCH, NONEUSED; ALTNFTMLN,
DARTPRODUCT,
NOMATCH,
NONEUSED;
public static NameConvention parseNameConvention(String name) { public static NameConvention parseNameConvention(String name) {
for (NameConvention nameConvention : values()) { for (NameConvention nameConvention : values()) {
@@ -40,9 +43,8 @@ public enum NameConvention {
} }
} }
NameConvention nameConvention = NameConvention.NOMATCH; NameConvention nameConvention = NameConvention.NOMATCH;
System.out System.out.println("NameConvention.parseNameConvention()" + " could not parse naming convention:" + name
.println("NameConvention.parseNameConvention()" + " could not parse naming convention:" + ". Returning:" + nameConvention.toString());
+ name + ". Returning:" + nameConvention.toString());
return nameConvention; return nameConvention;
} }
} }


@@ -30,6 +30,11 @@ package terrasaur.altwg.pipeline;
* *
*/ */
public enum NameFields { public enum NameFields {
GSD,
GSD, DATATYPE, VERSION, DATASRC, CLATLON, REGION, TBODY; DATATYPE,
VERSION,
DATASRC,
CLATLON,
REGION,
TBODY;
} }


@@ -48,8 +48,7 @@ public class NamingFactory {
return new DartNamer(); return new DartNamer();
default: default:
System.err.println( System.err.println("ERROR! Naming convention:" + namingConvention.toString() + " not supported!");
"ERROR! Naming convention:" + namingConvention.toString() + " not supported!");
throw new RuntimeException(); throw new RuntimeException();
} }
} }
@@ -60,8 +59,7 @@ public class NamingFactory {
* @param pipeConfig * @param pipeConfig
* @return * @return
*/ */
public static ProductNamer parseNamingConvention( public static ProductNamer parseNamingConvention(Map<AltPipelnEnum, String> pipeConfig, boolean verbose) {
Map<AltPipelnEnum, String> pipeConfig, boolean verbose) {
ProductNamer productNamer = null; ProductNamer productNamer = null;


@@ -29,8 +29,7 @@ public interface ProductNamer {
public String getNameFrag(String productName, int fieldNum); public String getNameFrag(String productName, int fieldNum);
public String productbaseName(FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, public String productbaseName(FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal);
boolean isGlobal);
public String getVersion(FitsHdrBuilder hdrBuilder); public String getVersion(FitsHdrBuilder hdrBuilder);
@@ -39,5 +38,4 @@ public interface ProductNamer {
public NameConvention getNameConvention(); public NameConvention getNameConvention();
public double gsdFromFilename(String filename); public double gsdFromFilename(String filename);
} }


@@ -25,7 +25,6 @@ package terrasaur.apps;
import java.io.File; import java.io.File;
import java.nio.charset.Charset; import java.nio.charset.Charset;
import java.util.*; import java.util.*;
import org.apache.commons.cli.CommandLine; import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option; import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options; import org.apache.commons.cli.Options;
@@ -33,14 +32,14 @@ import org.apache.commons.io.FileUtils;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D; import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger; import org.apache.logging.log4j.Logger;
import spice.basic.Plane;
import spice.basic.Vector3;
import terrasaur.smallBodyModel.BoundingBox; import terrasaur.smallBodyModel.BoundingBox;
import terrasaur.smallBodyModel.SmallBodyModel; import terrasaur.smallBodyModel.SmallBodyModel;
import terrasaur.templates.TerrasaurTool; import terrasaur.templates.TerrasaurTool;
import terrasaur.utils.Log4j2Configurator; import terrasaur.utils.Log4j2Configurator;
import terrasaur.utils.NativeLibraryLoader; import terrasaur.utils.NativeLibraryLoader;
import terrasaur.utils.PolyDataUtil; import terrasaur.utils.PolyDataUtil;
import spice.basic.Plane;
import spice.basic.Vector3;
import vtk.vtkGenericCell; import vtk.vtkGenericCell;
import vtk.vtkPoints; import vtk.vtkPoints;
import vtk.vtkPolyData; import vtk.vtkPolyData;
@@ -78,26 +77,20 @@ public class AdjustShapeModelToOtherShapeModel implements TerrasaurTool {
private static Options defineOptions() { private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions(); Options options = TerrasaurTool.defineOptions();
options.addOption( options.addOption(Option.builder("from")
Option.builder("from")
.hasArg() .hasArg()
.desc( .desc("path to first shape model in OBJ format which will get shifted to the second shape model")
"path to first shape model in OBJ format which will get shifted to the second shape model")
.build()); .build());
options.addOption( options.addOption(Option.builder("to")
Option.builder("to")
.hasArg() .hasArg()
.desc( .desc("path to second shape model in OBJ format which the first shape model will try to match to")
"path to second shape model in OBJ format which the first shape model will try to match to")
.build()); .build());
options.addOption( options.addOption(Option.builder("output")
Option.builder("output")
.hasArg() .hasArg()
.desc( .desc(
"path to adjusted shape model in OBJ format generated by this program by shifting first to second") "path to adjusted shape model in OBJ format generated by this program by shifting first to second")
.build()); .build());
options.addOption( options.addOption(Option.builder("filelist")
Option.builder("filelist")
.desc( .desc(
""" """
If specified then the second required argument to this program, If specified then the second required argument to this program,
@@ -109,8 +102,7 @@ public class AdjustShapeModelToOtherShapeModel implements TerrasaurTool {
However, the global shape model formed when all these pieces are However, the global shape model formed when all these pieces are
combined together, may not have any holes or gaps.""") combined together, may not have any holes or gaps.""")
.build()); .build());
options.addOption( options.addOption(Option.builder("fit-plane-radius")
Option.builder("fit-plane-radius")
.hasArg() .hasArg()
.desc( .desc(
""" """
@@ -119,8 +111,7 @@ public class AdjustShapeModelToOtherShapeModel implements TerrasaurTool {
Use this normal to adjust the point to the second shape model rather Use this normal to adjust the point to the second shape model rather
than the radial vector.""") than the radial vector.""")
.build()); .build());
options.addOption( options.addOption(Option.builder("local")
Option.builder("local")
.desc( .desc(
""" """
Use when adjusting a local OBJ file to another. The best fit plane to the Use when adjusting a local OBJ file to another. The best fit plane to the
@@ -139,10 +130,7 @@ public class AdjustShapeModelToOtherShapeModel implements TerrasaurTool {
} }
public static void adjustShapeModelToOtherShapeModel( public static void adjustShapeModelToOtherShapeModel(
vtkPolyData frompolydata, vtkPolyData frompolydata, ArrayList<vtkPolyData> topolydata, double planeRadius, boolean localModel)
ArrayList<vtkPolyData> topolydata,
double planeRadius,
boolean localModel)
throws Exception { throws Exception {
vtkPoints points = frompolydata.GetPoints(); vtkPoints points = frompolydata.GetPoints();
long numberPoints = frompolydata.GetNumberOfPoints(); long numberPoints = frompolydata.GetNumberOfPoints();
@@ -173,8 +161,7 @@ public class AdjustShapeModelToOtherShapeModel implements TerrasaurTool {
// fit a plane to the local model and check that the normal points outward // fit a plane to the local model and check that the normal points outward
Plane localPlane = PolyDataUtil.fitPlaneToPolyData(frompolydata); Plane localPlane = PolyDataUtil.fitPlaneToPolyData(frompolydata);
Vector3 localNormalVector = localPlane.getNormal(); Vector3 localNormalVector = localPlane.getNormal();
if (localNormalVector.dot(localPlane.getPoint()) < 0) if (localNormalVector.dot(localPlane.getPoint()) < 0) localNormalVector = localNormalVector.negate();
localNormalVector = localNormalVector.negate();
localNormal = localNormalVector.toArray(); localNormal = localNormalVector.toArray();
} }
@@ -182,6 +169,7 @@ public class AdjustShapeModelToOtherShapeModel implements TerrasaurTool {
Vector3D origin = new Vector3D(0., 0., 0.); Vector3D origin = new Vector3D(0., 0., 0.);
for (int i = 0; i < numberPoints; ++i) { for (int i = 0; i < numberPoints; ++i) {
points.GetPoint(i, p); points.GetPoint(i, p);
Vector3D thisPoint = new Vector3D(p);
Vector3D lookDir; Vector3D lookDir;
@@ -198,59 +186,37 @@ public class AdjustShapeModelToOtherShapeModel implements TerrasaurTool {
} }
Vector3D lookPt = lookDir.scalarMultiply(diagonalLength); Vector3D lookPt = lookDir.scalarMultiply(diagonalLength);
lookPt = lookPt.add(origin); lookPt = lookPt.add(thisPoint);
List<Vector3D> intersections = new ArrayList<>(); List<Vector3D> intersections = new ArrayList<>();
for (vtksbCellLocator cellLocator : cellLocators) { for (vtksbCellLocator cellLocator : cellLocators) {
double[] intersectPoint = new double[3]; double[] intersectPoint = new double[3];
// trace ray from the lookPt to the origin - first intersection is the farthest intersection // trace ray from thisPoint to the lookPt - Assume cell intersection is the closest one if
// from the origin // there are multiple?
int result =
cellLocator.IntersectWithLine(
lookPt.toArray(),
origin.toArray(),
tol,
t,
intersectPoint,
pcoords,
subId,
cell_id,
cell);
Vector3D intersectVector = new Vector3D(intersectPoint);
if (fitPlane || localModel) {
// NOTE: result should return 1 in case of intersection but doesn't sometimes. // NOTE: result should return 1 in case of intersection but doesn't sometimes.
// Use the norm of intersection point to test for intersection instead. // Use the norm of intersection point to test for intersection instead.
int result = cellLocator.IntersectWithLine(
thisPoint.toArray(), lookPt.toArray(), tol, t, intersectPoint, pcoords, subId, cell_id, cell);
Vector3D intersectVector = new Vector3D(intersectPoint);
NavigableMap<Double, Vector3D> pointsMap = new TreeMap<>(); NavigableMap<Double, Vector3D> pointsMap = new TreeMap<>();
if (intersectVector.getNorm() > 0) { if (intersectVector.getNorm() > 0) {
pointsMap.put(origin.subtract(intersectVector).getNorm(), intersectVector); pointsMap.put(thisPoint.subtract(intersectVector).getNorm(), intersectVector);
} }
// look in the other direction
lookPt = lookDir.scalarMultiply(-diagonalLength); lookPt = lookDir.scalarMultiply(-diagonalLength);
lookPt = lookPt.add(origin); lookPt = lookPt.add(thisPoint);
result = result = cellLocator.IntersectWithLine(
cellLocator.IntersectWithLine( thisPoint.toArray(), lookPt.toArray(), tol, t, intersectPoint, pcoords, subId, cell_id, cell);
lookPt.toArray(),
origin.toArray(),
tol,
t,
intersectPoint,
pcoords,
subId,
cell_id,
cell);
intersectVector = new Vector3D(intersectPoint); intersectVector = new Vector3D(intersectPoint);
if (intersectVector.getNorm() > 0) { if (intersectVector.getNorm() > 0) {
pointsMap.put(origin.subtract(intersectVector).getNorm(), intersectVector); pointsMap.put(thisPoint.subtract(intersectVector).getNorm(), intersectVector);
} }
if (!pointsMap.isEmpty()) intersections.add(pointsMap.get(pointsMap.firstKey())); if (!pointsMap.isEmpty()) intersections.add(pointsMap.get(pointsMap.firstKey()));
} else {
if (result > 0) intersections.add(intersectVector);
}
} }
if (intersections.isEmpty()) throw new Exception("Error: no intersections at all"); if (intersections.isEmpty()) throw new Exception("Error: no intersections at all");


@@ -59,48 +59,45 @@ public class AppendOBJ implements TerrasaurTool {
private static Options defineOptions() { private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions(); Options options = TerrasaurTool.defineOptions();
options.addOption( options.addOption(Option.builder("logFile")
Option.builder("logFile")
.hasArg() .hasArg()
.desc("If present, save screen output to log file.") .desc("If present, save screen output to log file.")
.build()); .build());
StringBuilder sb = new StringBuilder(); StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name())); for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption( options.addOption(Option.builder("logLevel")
Option.builder("logLevel")
.hasArg() .hasArg()
.desc( .desc("If present, print messages above selected priority. Valid values are "
"If present, print messages above selected priority. Valid values are "
+ sb.toString().trim() + sb.toString().trim()
+ ". Default is INFO.") + ". Default is INFO.")
.build()); .build());
options.addOption( options.addOption(Option.builder("boundary")
Option.builder("boundary")
.desc("Only save out boundary. This option implies -vtk.") .desc("Only save out boundary. This option implies -vtk.")
.build()); .build());
options.addOption( options.addOption(Option.builder("decimate")
Option.builder("decimate")
.hasArg() .hasArg()
.desc( .desc(
"Reduce the number of facets in the output shape model. The argument should be between 0 and 1. " "Reduce the number of facets in the output shape model. The argument should be between 0 and 1. "
+ "For example, if a model has 100 facets and <arg> is 0.90, " + "For example, if a model has 100 facets and <arg> is 0.90, "
+ "there will be approximately 10 facets after the decimation.") + "there will be approximately 10 facets after the decimation.")
.build()); .build());
options.addOption( options.addOption(Option.builder("input")
Option.builder("input")
.required() .required()
.hasArgs() .hasArgs()
.desc( .desc("input file(s) to read. Format is derived from the allowed extension: "
"input file(s) to read. Format is derived from the allowed extension: "
+ "icq, llr, obj, pds, plt, ply, stl, or vtk. Multiple files can be specified " + "icq, llr, obj, pds, plt, ply, stl, or vtk. Multiple files can be specified "
+ "with a single -input option, separated by whitespace. Alternatively, you may " + "with a single -input option, separated by whitespace. Alternatively, you may "
+ "specify -input multiple times.") + "specify -input multiple times.")
.build()); .build());
options.addOption( options.addOption(Option.builder("output")
Option.builder("output").required().hasArg().desc("output file to write.").build()); .required()
options.addOption( .hasArg()
Option.builder("vtk").desc("Save output file in VTK format rather than OBJ.").build()); .desc("output file to write.")
.build());
options.addOption(Option.builder("vtk")
.desc("Save output file in VTK format rather than OBJ.")
.build());
return options; return options;
} }
@@ -119,8 +116,7 @@ public class AppendOBJ implements TerrasaurTool {
boolean boundaryOnly = cl.hasOption("boundary"); boolean boundaryOnly = cl.hasOption("boundary");
boolean vtkFormat = boundaryOnly || cl.hasOption("vtk"); boolean vtkFormat = boundaryOnly || cl.hasOption("vtk");
boolean decimate = cl.hasOption("decimate"); boolean decimate = cl.hasOption("decimate");
double decimationPercentage = double decimationPercentage = decimate ? Double.parseDouble(cl.getOptionValue("decimate")) : 1.0;
decimate ? Double.parseDouble(cl.getOptionValue("decimate")) : 1.0;
NativeLibraryLoader.loadVtkLibraries(); NativeLibraryLoader.loadVtkLibraries();


@@ -41,7 +41,6 @@ public class BatchSubmit implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger(); private static final Logger logger = LogManager.getLogger();
@Override @Override
public String shortDescription() { public String shortDescription() {
return "Run a command on a cluster."; return "Run a command on a cluster.";
@@ -53,13 +52,11 @@ public class BatchSubmit implements TerrasaurTool {
String footer = "\nRun a command on a cluster.\n"; String footer = "\nRun a command on a cluster.\n";
return TerrasaurTool.super.fullDescription(options, "", footer); return TerrasaurTool.super.fullDescription(options, "", footer);
} }
private static Options defineOptions() { private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions(); Options options = TerrasaurTool.defineOptions();
options.addOption( options.addOption(Option.builder("command")
Option.builder("command")
.required() .required()
.hasArgs() .hasArgs()
.desc("Required. Command(s) to run.") .desc("Required. Command(s) to run.")
@@ -67,14 +64,12 @@ public class BatchSubmit implements TerrasaurTool {
StringBuilder sb = new StringBuilder(); StringBuilder sb = new StringBuilder();
for (GridType type : GridType.values()) sb.append(String.format("%s ", type.name())); for (GridType type : GridType.values()) sb.append(String.format("%s ", type.name()));
options.addOption( options.addOption(Option.builder("gridType")
Option.builder("gridType")
.hasArg() .hasArg()
.desc("Grid type. Valid values are " + sb + ". Default is LOCAL.") .desc("Grid type. Valid values are " + sb + ". Default is LOCAL.")
.build()); .build());
options.addOption( options.addOption(Option.builder("workingDir")
Option.builder("workingDir")
.hasArg() .hasArg()
.desc("Working directory to run command. Default is current directory.") .desc("Working directory to run command. Default is current directory.")
.build()); .build());
@@ -95,8 +90,7 @@ public class BatchSubmit implements TerrasaurTool {
List<String> cmdList = Arrays.asList(cl.getOptionValues("command")); List<String> cmdList = Arrays.asList(cl.getOptionValues("command"));
BatchType batchType = BatchType.GRID_ENGINE; BatchType batchType = BatchType.GRID_ENGINE;
GridType gridType = GridType gridType = cl.hasOption("gridType") ? GridType.valueOf(cl.getOptionValue("gridType")) : GridType.LOCAL;
cl.hasOption("gridType") ? GridType.valueOf(cl.getOptionValue("gridType")) : GridType.LOCAL;
BatchSubmitI submitter = BatchSubmitFactory.getBatchSubmit(cmdList, batchType, gridType); BatchSubmitI submitter = BatchSubmitFactory.getBatchSubmit(cmdList, batchType, gridType);
String workingDir = ""; String workingDir = "";
@@ -106,5 +100,4 @@ public class BatchSubmit implements TerrasaurTool {
logger.error(e.getLocalizedMessage(), e); logger.error(e.getLocalizedMessage(), e);
} }
} }
} }


@@ -64,7 +64,7 @@ import terrasaur.utils.math.MathConversions;
public class CKFromSumFile implements TerrasaurTool { public class CKFromSumFile implements TerrasaurTool {
private final static Logger logger = LogManager.getLogger(); private static final Logger logger = LogManager.getLogger();
@Override @Override
public String shortDescription() { public String shortDescription() {
@@ -79,20 +79,31 @@ public class CKFromSumFile implements TerrasaurTool {
private static Options defineOptions() { private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions(); Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("config").required().hasArg() options.addOption(Option.builder("config")
.desc("Required. Name of configuration file.").build()); .required()
options.addOption(Option.builder("dumpConfig").hasArg() .hasArg()
.desc("Write out an example configuration to the named file.").build()); .desc("Required. Name of configuration file.")
options.addOption(Option.builder("logFile").hasArg() .build());
.desc("If present, save screen output to log file.").build()); options.addOption(Option.builder("dumpConfig")
.hasArg()
.desc("Write out an example configuration to the named file.")
.build());
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder(); StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
sb.append(String.format("%s ", l.name())); options.addOption(Option.builder("logLevel")
options.addOption(Option.builder("logLevel").hasArg() .hasArg()
.desc("If present, print messages above selected priority. Valid values are " .desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim() + ". Default is INFO.") + sb.toString().trim() + ". Default is INFO.")
.build()); .build());
options.addOption(Option.builder("sumFile").hasArg().required().desc(""" options.addOption(Option.builder("sumFile")
.hasArg()
.required()
.desc(
"""
Required. File listing sumfiles to read. This is a text file, one per line. Required. File listing sumfiles to read. This is a text file, one per line.
Lines starting with # are ignored. Lines starting with # are ignored.
@@ -106,14 +117,18 @@ public class CKFromSumFile implements TerrasaurTool {
D717506131G0.SUM D717506131G0.SUM
# This is a comment # This is a comment
D717506132G0.SUM D717506132G0.SUM
""").build()); """)
.build());
return options; return options;
} }
private final CKFromSumFileConfig config; private final CKFromSumFileConfig config;
private final NavigableMap<SumFile, String> sumFiles; private final NavigableMap<SumFile, String> sumFiles;
private CKFromSumFile(){config=null;sumFiles=null;} private CKFromSumFile() {
config = null;
sumFiles = null;
}
public CKFromSumFile(CKFromSumFileConfig config, NavigableMap<SumFile, String> sumFiles) { public CKFromSumFile(CKFromSumFileConfig config, NavigableMap<SumFile, String> sumFiles) {
this.config = config; this.config = config;
@@ -136,8 +151,7 @@ public class CKFromSumFile implements TerrasaurTool {
File commentFile = new File(basename + "-comments.txt"); File commentFile = new File(basename + "-comments.txt");
if (commentFile.exists()) if (commentFile.exists())
if (!commentFile.delete()) if (!commentFile.delete()) logger.error("{} exists but cannot be deleted!", commentFile.getPath());
logger.error("{} exists but cannot be deleted!", commentFile.getPath());
String setupFile = basename + ".setup"; String setupFile = basename + ".setup";
String inputFile = basename + ".inp"; String inputFile = basename + ".inp";
@@ -153,14 +167,21 @@ public class CKFromSumFile implements TerrasaurTool {
sb.append(String.format("\t%s %s\n", sumFile.utcString(), sumFiles.get(sumFile))); sb.append(String.format("\t%s %s\n", sumFile.utcString(), sumFiles.get(sumFile)));
} }
sb.append("\n"); sb.append("\n");
sb.append("providing the orientation of ").append(scFrame.getName()).append(" with respect to ").append(config.J2000() ? "J2000" : bodyFixed.getName()).append(". "); sb.append("providing the orientation of ")
.append(scFrame.getName())
.append(" with respect to ")
.append(config.J2000() ? "J2000" : bodyFixed.getName())
.append(". ");
double first = new TDBTime(sumFiles.firstKey().utcString()).getTDBSeconds(); double first = new TDBTime(sumFiles.firstKey().utcString()).getTDBSeconds();
double last = new TDBTime(sumFiles.lastKey().utcString()).getTDBSeconds() + config.extend(); double last = new TDBTime(sumFiles.lastKey().utcString()).getTDBSeconds() + config.extend();
sb.append("The coverage period is ").append(new TDBTime(first).toUTCString("ISOC", 3)).append(" to ").append(new TDBTime(last).toUTCString("ISOC", 3)).append(" UTC."); sb.append("The coverage period is ")
.append(new TDBTime(first).toUTCString("ISOC", 3))
.append(" to ")
.append(new TDBTime(last).toUTCString("ISOC", 3))
.append(" UTC.");
String allComments = sb.toString(); String allComments = sb.toString();
for (String comment : allComments.split("\\r?\\n")) for (String comment : allComments.split("\\r?\\n")) pw.println(WordUtils.wrap(comment, 80));
pw.println(WordUtils.wrap(comment, 80));
} catch (FileNotFoundException e) { } catch (FileNotFoundException e) {
logger.error(e.getLocalizedMessage(), e); logger.error(e.getLocalizedMessage(), e);
} }
@@ -171,10 +192,8 @@ public class CKFromSumFile implements TerrasaurTool {
map.put("CK_TYPE", "3"); map.put("CK_TYPE", "3");
map.put("COMMENTS_FILE_NAME", String.format("'%s'", commentFile.getPath())); map.put("COMMENTS_FILE_NAME", String.format("'%s'", commentFile.getPath()));
map.put("INSTRUMENT_ID", String.format("%d", scFrame.getIDCode())); map.put("INSTRUMENT_ID", String.format("%d", scFrame.getIDCode()));
map.put("REFERENCE_FRAME_NAME", map.put("REFERENCE_FRAME_NAME", String.format("'%s'", config.J2000() ? "J2000" : bodyFixed.getName()));
String.format("'%s'", config.J2000() ? "J2000" : bodyFixed.getName())); if (!config.fk().isEmpty()) map.put("FRAMES_FILE_NAME", "'" + config.fk() + "'");
if (!config.fk().isEmpty())
map.put("FRAMES_FILE_NAME", "'" + config.fk() + "'");
map.put("ANGULAR_RATE_PRESENT", "'MAKE UP/NO AVERAGING'"); map.put("ANGULAR_RATE_PRESENT", "'MAKE UP/NO AVERAGING'");
map.put("INPUT_TIME_TYPE", "'UTC'"); map.put("INPUT_TIME_TYPE", "'UTC'");
map.put("INPUT_DATA_TYPE", "'SPICE QUATERNIONS'"); map.put("INPUT_DATA_TYPE", "'SPICE QUATERNIONS'");
@@ -202,12 +221,9 @@ public class CKFromSumFile implements TerrasaurTool {
Vector3 row1 = rows[Math.abs(config.flipY()) - 1]; Vector3 row1 = rows[Math.abs(config.flipY()) - 1];
Vector3 row2 = rows[Math.abs(config.flipZ()) - 1]; Vector3 row2 = rows[Math.abs(config.flipZ()) - 1];
if (config.flipX() < 0) if (config.flipX() < 0) row0 = row0.negate();
row0 = row0.negate(); if (config.flipY() < 0) row1 = row1.negate();
if (config.flipY() < 0) if (config.flipZ() < 0) row2 = row2.negate();
row1 = row1.negate();
if (config.flipZ() < 0)
row2 = row2.negate();
Matrix33 refToInstr = new Matrix33(row0, row1, row2); Matrix33 refToInstr = new Matrix33(row0, row1, row2);
@@ -231,8 +247,9 @@ public class CKFromSumFile implements TerrasaurTool {
try (PrintWriter pw = new PrintWriter(new FileWriter(inputFile))) { try (PrintWriter pw = new PrintWriter(new FileWriter(inputFile))) {
for (double t : attitudeMap.keySet()) { for (double t : attitudeMap.keySet()) {
SpiceQuaternion q = attitudeMap.get(t); SpiceQuaternion q = attitudeMap.get(t);
pw.printf("%s %.14e %.14e %.14e %.14e\n", new TDBTime(t).toUTCString("ISOC", 6), pw.printf(
q.getElt(0), q.getElt(1), q.getElt(2), q.getElt(3)); "%s %.14e %.14e %.14e %.14e\n",
new TDBTime(t).toUTCString("ISOC", 6), q.getElt(0), q.getElt(1), q.getElt(2), q.getElt(3));
} }
} catch (IOException e) { } catch (IOException e) {
logger.error(e.getLocalizedMessage(), e); logger.error(e.getLocalizedMessage(), e);
@@ -277,24 +294,21 @@ public class CKFromSumFile implements TerrasaurTool {
CKFromSumFileConfig appConfig = factory.fromConfig(config); CKFromSumFileConfig appConfig = factory.fromConfig(config);
for (String kernel : appConfig.metakernel()) for (String kernel : appConfig.metakernel()) KernelDatabase.load(kernel);
KernelDatabase.load(kernel);
NavigableMap<SumFile, String> sumFiles = new TreeMap<>((o1, o2) -> { NavigableMap<SumFile, String> sumFiles = new TreeMap<>((o1, o2) -> {
try { try {
return Double.compare(new TDBTime(o1.utcString()).getTDBSeconds(), return Double.compare(
new TDBTime(o2.utcString()).getTDBSeconds()); new TDBTime(o1.utcString()).getTDBSeconds(), new TDBTime(o2.utcString()).getTDBSeconds());
} catch (SpiceErrorException e) { } catch (SpiceErrorException e) {
logger.error(e.getLocalizedMessage(), e); logger.error(e.getLocalizedMessage(), e);
} }
return 0; return 0;
}); });
List<String> lines = List<String> lines = FileUtils.readLines(new File(cl.getOptionValue("sumFile")), Charset.defaultCharset());
FileUtils.readLines(new File(cl.getOptionValue("sumFile")), Charset.defaultCharset());
for (String line : lines) { for (String line : lines) {
if (line.strip().startsWith("#")) if (line.strip().startsWith("#")) continue;
continue;
String[] parts = line.strip().split("\\s+"); String[] parts = line.strip().split("\\s+");
String filename = parts[0].trim(); String filename = parts[0].trim();
sumFiles.put(SumFile.fromFile(new File(filename)), FilenameUtils.getBaseName(filename)); sumFiles.put(SumFile.fromFile(new File(filename)), FilenameUtils.getBaseName(filename));
@@ -305,13 +319,10 @@ public class CKFromSumFile implements TerrasaurTool {
TDBTime end = new TDBTime(sumFiles.lastKey().utcString()); TDBTime end = new TDBTime(sumFiles.lastKey().utcString());
String picture = "YYYY_DOY"; String picture = "YYYY_DOY";
String command = app.writeMSOPCKFiles( String command = app.writeMSOPCKFiles(
String.format("ck_%s_%s", begin.toString(picture), end.toString(picture)), String.format("ck_%s_%s", begin.toString(picture), end.toString(picture)), new ArrayList<>());
new ArrayList<>());
logger.info("To generate the CK, run:\n\t{}", command); logger.info("To generate the CK, run:\n\t{}", command);
logger.info("Finished."); logger.info("Finished.");
} }
} }


@@ -136,8 +136,10 @@ public class ColorSpots implements TerrasaurTool {
} }
} else { } else {
if (format == FORMAT.LLR) { if (format == FORMAT.LLR) {
double lon = Math.toRadians(Double.parseDouble(record.get(0).trim())); double lon =
double lat = Math.toRadians(Double.parseDouble(record.get(1).trim())); Math.toRadians(Double.parseDouble(record.get(0).trim()));
double lat =
Math.toRadians(Double.parseDouble(record.get(1).trim()));
double rad = Double.parseDouble(record.get(2).trim()); double rad = Double.parseDouble(record.get(2).trim());
Vector3D xyz = new Vector3D(lon, lat).scalarMultiply(rad); Vector3D xyz = new Vector3D(lon, lat).scalarMultiply(rad);
values[0] = xyz.getX(); values[0] = xyz.getX();
@@ -166,9 +168,7 @@ public class ColorSpots implements TerrasaurTool {
public TreeMap<Long, DescriptiveStatistics> getStatsFast( public TreeMap<Long, DescriptiveStatistics> getStatsFast(
ArrayList<double[]> valuesList, double radius, boolean weight, boolean atVertices) { ArrayList<double[]> valuesList, double radius, boolean weight, boolean atVertices) {
return atVertices return atVertices ? getStatsVertex(valuesList, radius, weight) : getStatsFacet(valuesList, radius, weight);
? getStatsVertex(valuesList, radius, weight)
: getStatsFacet(valuesList, radius, weight);
} }
private TreeMap<Long, DescriptiveStatistics> getStatsVertex( private TreeMap<Long, DescriptiveStatistics> getStatsVertex(
@@ -257,8 +257,7 @@ public class ColorSpots implements TerrasaurTool {
return statMap; return statMap;
} }
public TreeMap<Integer, DescriptiveStatistics> getStats( public TreeMap<Integer, DescriptiveStatistics> getStats(ArrayList<double[]> valuesList, double radius) {
ArrayList<double[]> valuesList, double radius) {
// for each value, store indices of closest cells and distances // for each value, store indices of closest cells and distances
TreeMap<Integer, ArrayList<Pair<Long, Double>>> closestCells = new TreeMap<>(); TreeMap<Integer, ArrayList<Pair<Long, Double>>> closestCells = new TreeMap<>();
@@ -266,8 +265,7 @@ public class ColorSpots implements TerrasaurTool {
double[] values = valuesList.get(i); double[] values = valuesList.get(i);
Vector3D xyz = new Vector3D(values); Vector3D xyz = new Vector3D(values);
TreeSet<Long> sortedCellIDs = TreeSet<Long> sortedCellIDs = new TreeSet<>(smallBodyModel.findClosestCellsWithinRadius(values, radius));
new TreeSet<>(smallBodyModel.findClosestCellsWithinRadius(values, radius));
sortedCellIDs.add(smallBodyModel.findClosestCell(values)); sortedCellIDs.add(smallBodyModel.findClosestCell(values));
ArrayList<Pair<Long, Double>> distances = new ArrayList<>(); ArrayList<Pair<Long, Double>> distances = new ArrayList<>();
@@ -329,82 +327,69 @@ public class ColorSpots implements TerrasaurTool {
private static Options defineOptions() { private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions(); Options options = TerrasaurTool.defineOptions();
options.addOption( options.addOption(Option.builder("additionalFields")
Option.builder("additionalFields")
.hasArg() .hasArg()
.desc( .desc(
"Specify additional fields to write out. Allowed values are min, max, median, n, rms, sum, std, variance. " "Specify additional fields to write out. Allowed values are min, max, median, n, rms, sum, std, variance. "
+ "More than one field may be specified in a comma separated list (e.g. " + "More than one field may be specified in a comma separated list (e.g. "
+ "-additionalFields sum,median,rms). Additional fields will be written out after the mean and std columns.") + "-additionalFields sum,median,rms). Additional fields will be written out after the mean and std columns.")
.build()); .build());
options.addOption( options.addOption(Option.builder("allFacets")
Option.builder("allFacets") .desc("Report values for all facets in OBJ shape model, even if facet is not within searchRadius "
.desc(
"Report values for all facets in OBJ shape model, even if facet is not within searchRadius "
+ "of any points. Prints NaN if facet not within searchRadius. Default is to only " + "of any points. Prints NaN if facet not within searchRadius. Default is to only "
+ "print facets which have contributions from input points.") + "print facets which have contributions from input points.")
.build()); .build());
options.addOption( options.addOption(Option.builder("info")
Option.builder("info")
.required() .required()
.hasArg() .hasArg()
.desc( .desc(
"Required. Name of CSV file containing value to plot." "Required. Name of CSV file containing value to plot."
+ " Default format is lon, lat, radius, value. See -xyz and -llOnly options for alternate formats.") + " Default format is lon, lat, radius, value. See -xyz and -llOnly options for alternate formats.")
.build()); .build());
options.addOption( options.addOption(Option.builder("llOnly")
Option.builder("llOnly").desc("Format of -info file is lon, lat, value.").build()); .desc("Format of -info file is lon, lat, value.")
options.addOption( .build());
Option.builder("logFile") options.addOption(Option.builder("logFile")
.hasArg() .hasArg()
.desc("If present, save screen output to log file.") .desc("If present, save screen output to log file.")
.build()); .build());
StringBuilder sb = new StringBuilder(); StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name())); for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption( options.addOption(Option.builder("logLevel")
Option.builder("logLevel")
.hasArg() .hasArg()
.desc( .desc("If present, print messages above selected priority. Valid values are "
"If present, print messages above selected priority. Valid values are "
+ sb.toString().trim() + sb.toString().trim()
+ ". Default is INFO.") + ". Default is INFO.")
.build()); .build());
options.addOption( options.addOption(Option.builder("normalize")
Option.builder("normalize") .desc("Report values per unit area (divide by total area of facets within search ellipse).")
.desc(
"Report values per unit area (divide by total area of facets within search ellipse).")
.build()); .build());
options.addOption( options.addOption(Option.builder("noWeight")
Option.builder("noWeight")
.desc("Do not weight points by distance from facet/vertex.") .desc("Do not weight points by distance from facet/vertex.")
.build()); .build());
options.addOption( options.addOption(Option.builder("obj")
Option.builder("obj")
.required() .required()
.hasArg() .hasArg()
.desc("Required. Name of shape model to read.") .desc("Required. Name of shape model to read.")
.build()); .build());
options.addOption( options.addOption(Option.builder("outFile")
Option.builder("outFile")
.hasArg() .hasArg()
.desc("Specify output file to store the output.") .desc("Specify output file to store the output.")
.build()); .build());
options.addOption( options.addOption(Option.builder("searchRadius")
Option.builder("searchRadius")
.hasArg() .hasArg()
.desc( .desc(
"Each facet will be colored using a weighted average of all points within searchRadius of the facet/vertex. " "Each facet will be colored using a weighted average of all points within searchRadius of the facet/vertex. "
+ "If not present, set to sqrt(2)/2 * mean facet edge length.") + "If not present, set to sqrt(2)/2 * mean facet edge length.")
.build()); .build());
options.addOption( options.addOption(Option.builder("writeVertices")
Option.builder("writeVertices") .desc("Convert output from a per facet to per vertex format. Each line will be of the form"
.desc(
"Convert output from a per facet to per vertex format. Each line will be of the form"
+ " x, y, z, value, sigma where x, y, z are the vector components of vertex V. " + " x, y, z, value, sigma where x, y, z are the vector components of vertex V. "
+ " Default is to only report facetID, facet_value, facet_sigma.") + " Default is to only report facetID, facet_value, facet_sigma.")
.build()); .build());
options.addOption( options.addOption(Option.builder("xyz")
Option.builder("xyz").desc("Format of -info file is x, y, z, value.").build()); .desc("Format of -info file is x, y, z, value.")
.build());
return options; return options;
} }
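For reference, a minimal sketch of the -searchRadius default described above (sqrt(2)/2 times the mean facet edge length). The edge length used here is a placeholder value, not something computed from the shape model in this excerpt.

    // Sketch only; not part of ColorSpots.
    class SearchRadiusSketch {
      static double defaultSearchRadius(double meanFacetEdgeLength) {
        // sqrt(2)/2 * mean facet edge length, per the -searchRadius description
        return Math.sqrt(2.0) / 2.0 * meanFacetEdgeLength;
      }

      public static void main(String[] args) {
        // a hypothetical 10 m mean edge length gives a ~7.07 m default radius
        System.out.println(defaultSearchRadius(10.0));
      }
    }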
@@ -449,8 +434,7 @@ public class ColorSpots implements TerrasaurTool {
ColorSpots cs = new ColorSpots(polyData); ColorSpots cs = new ColorSpots(polyData);
ArrayList<double[]> infoValues = cs.readCSV(cl.getOptionValue("info"), format); ArrayList<double[]> infoValues = cs.readCSV(cl.getOptionValue("info"), format);
TreeMap<Long, DescriptiveStatistics> statMap = TreeMap<Long, DescriptiveStatistics> statMap = cs.getStatsFast(infoValues, radius, weight, writeVerts);
cs.getStatsFast(infoValues, radius, weight, writeVerts);
double totalArea = 0; double totalArea = 0;
if (normalize) { if (normalize) {
@@ -461,8 +445,7 @@ public class ColorSpots implements TerrasaurTool {
double[] pt1 = points.GetPoint(1); double[] pt1 = points.GetPoint(1);
double[] pt2 = points.GetPoint(2); double[] pt2 = points.GetPoint(2);
TriangularFacet tf = TriangularFacet tf = new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
double area = tf.getArea(); double area = tf.getArea();
totalArea += area; totalArea += area;
@@ -473,7 +456,8 @@ public class ColorSpots implements TerrasaurTool {
ArrayList<FIELD> fields = new ArrayList<>(); ArrayList<FIELD> fields = new ArrayList<>();
if (cl.hasOption("additionalFields")) { if (cl.hasOption("additionalFields")) {
for (String s : cl.getOptionValue("additionalFields").trim().toUpperCase().split(",")) { for (String s :
cl.getOptionValue("additionalFields").trim().toUpperCase().split(",")) {
for (FIELD f : FIELD.values()) { for (FIELD f : FIELD.values()) {
if (f.name().equalsIgnoreCase(s)) fields.add(f); if (f.name().equalsIgnoreCase(s)) fields.add(f);
} }
@@ -533,10 +517,7 @@ public class ColorSpots implements TerrasaurTool {
} }
private static ArrayList<String> writeFacets( private static ArrayList<String> writeFacets(
TreeMap<Long, ArrayList<Double>> map, TreeMap<Long, ArrayList<Double>> map, boolean allFacets, boolean normalize, double totalArea) {
boolean allFacets,
boolean normalize,
double totalArea) {
ArrayList<String> returnList = new ArrayList<>(); ArrayList<String> returnList = new ArrayList<>();
@@ -574,8 +555,7 @@ public class ColorSpots implements TerrasaurTool {
if (allFacets || !value.isNaN()) { if (allFacets || !value.isNaN()) {
// get vertex x,y,z values // get vertex x,y,z values
polyData.GetPoint(vertex, thisPt); polyData.GetPoint(vertex, thisPt);
StringBuilder sb = StringBuilder sb = new StringBuilder(
new StringBuilder(
String.format("%e, %e, %e, %e, %e", thisPt[0], thisPt[1], thisPt[2], value, sigma)); String.format("%e, %e, %e, %e, %e", thisPt[0], thisPt[1], thisPt[2], value, sigma));
for (int i = 2; i < values.size(); i++) { for (int i = 2; i < values.size(); i++) {
value = values.get(i); value = values.get(i);
View File
@@ -157,8 +157,7 @@ public class CompareOBJ implements TerrasaurTool {
* @param useOverlappingPoints if true, only points overlapping the reference model will be added * @param useOverlappingPoints if true, only points overlapping the reference model will be added
* to the track file. * to the track file.
*/ */
private void convertPolyDataToTrackFormat( private void convertPolyDataToTrackFormat(vtkPolyData polydata, String filename, boolean useOverlappingPoints) {
vtkPolyData polydata, String filename, boolean useOverlappingPoints) {
vtkPoints polyDataPoints = polydata.GetPoints(); vtkPoints polyDataPoints = polydata.GetPoints();
vtkPoints points = polyDataPoints; vtkPoints points = polyDataPoints;
@@ -172,9 +171,7 @@ public class CompareOBJ implements TerrasaurTool {
double[] p = new double[3]; double[] p = new double[3];
for (int i = 0; i < points.GetNumberOfPoints(); ++i) { for (int i = 0; i < points.GetNumberOfPoints(); ++i) {
points.GetPoint(i, p); points.GetPoint(i, p);
pw.write( pw.write(String.format("2010-11-11T00:00:00.000 "
String.format(
"2010-11-11T00:00:00.000 "
+ p[0] + p[0]
+ " " + " "
+ p[1] + p[1]
@@ -208,9 +205,7 @@ public class CompareOBJ implements TerrasaurTool {
int maxNumberOfControlPoints, int maxNumberOfControlPoints,
boolean useOverlappingPoints, boolean useOverlappingPoints,
String transformationFile) { String transformationFile) {
File tmpDir = File tmpDir = new File(String.format("%s%sCompareOBJ-%d", tmpdir, File.separator, System.currentTimeMillis()));
new File(
String.format("%s%sCompareOBJ-%d", tmpdir, File.separator, System.currentTimeMillis()));
if (!tmpDir.exists()) tmpDir.mkdirs(); if (!tmpDir.exists()) tmpDir.mkdirs();
String trackFile = tmpDir + File.separator + "shapemodel-as-track.txt"; String trackFile = tmpDir + File.separator + "shapemodel-as-track.txt";
@@ -223,8 +218,7 @@ public class CompareOBJ implements TerrasaurTool {
else if (computeOptimalRotation) transformationType = "--rotation-only "; else if (computeOptimalRotation) transformationType = "--rotation-only ";
File lsk = ResourceUtils.writeResourceToFile("/resources/kernels/lsk/naif0012.tls"); File lsk = ResourceUtils.writeResourceToFile("/resources/kernels/lsk/naif0012.tls");
String command = String command = "lidar-optimize --max-number-control-points "
"lidar-optimize --max-number-control-points "
+ maxNumberOfControlPoints + maxNumberOfControlPoints
+ " " + " "
+ transformationType + transformationType
@@ -249,8 +243,7 @@ public class CompareOBJ implements TerrasaurTool {
// save a copy of the JSON transformation file // save a copy of the JSON transformation file
if (transformationFile != null) { if (transformationFile != null) {
try (PrintWriter pw = new PrintWriter(transformationFile)) { try (PrintWriter pw = new PrintWriter(transformationFile)) {
List<String> lines = List<String> lines = FileUtils.readLines(new File(tmpTransformationFile), Charset.defaultCharset());
FileUtils.readLines(new File(tmpTransformationFile), Charset.defaultCharset());
for (String line : lines) pw.println(line); for (String line : lines) pw.println(line);
} catch (IOException e) { } catch (IOException e) {
logger.error(e.getLocalizedMessage(), e); logger.error(e.getLocalizedMessage(), e);
@@ -436,8 +429,7 @@ public class CompareOBJ implements TerrasaurTool {
sumDiff *= radius; sumDiff *= radius;
System.out.printf( System.out.printf(
"Sum normalized difference in horizontal distances (%d points used): %f meters\n", "Sum normalized difference in horizontal distances (%d points used): %f meters\n", npts, sumDiff);
npts, sumDiff);
} }
/** /**
@@ -520,8 +512,7 @@ public class CompareOBJ implements TerrasaurTool {
pointLocator.FindPointsWithinRadius(radius, p.toArray(), idList); pointLocator.FindPointsWithinRadius(radius, p.toArray(), idList);
if (idList.GetNumberOfIds() < 3) { if (idList.GetNumberOfIds() < 3) {
logger.error( logger.error(String.format(
String.format(
"point %d (%f %f %f): %d points within %f, using radial vector to find intersection", "point %d (%f %f %f): %d points within %f, using radial vector to find intersection",
i, p.getX(), p.getY(), p.getZ(), idList.GetNumberOfIds(), radius)); i, p.getX(), p.getY(), p.getZ(), idList.GetNumberOfIds(), radius));
normal = p.normalize(); normal = p.normalize();
@@ -547,8 +538,7 @@ public class CompareOBJ implements TerrasaurTool {
} }
} }
Optional<Vector3D> normalPoint = Optional<Vector3D> normalPoint = findIntersectPointInNormalDirection(p, normal, smallBodyTruth);
findIntersectPointInNormalDirection(p, normal, smallBodyTruth);
DistanceContainer dc = null; DistanceContainer dc = null;
// Skip this plate in the error calculation if there is no intersection // Skip this plate in the error calculation if there is no intersection
@@ -569,8 +559,7 @@ public class CompareOBJ implements TerrasaurTool {
if (saveIndex) { if (saveIndex) {
long closestFacet = smallBodyTruth.findClosestCell(p.toArray()); long closestFacet = smallBodyTruth.findClosestCell(p.toArray());
outClosestIndices.write( outClosestIndices.write(String.format("%d, %d, %f\n", i, closestFacet, closestDistance));
String.format("%d, %d, %f\n", i, closestFacet, closestDistance));
} }
} }
@@ -630,39 +619,33 @@ public class CompareOBJ implements TerrasaurTool {
} }
minDist = distanceContainerVector.get(0).getClosestDistance(); minDist = distanceContainerVector.get(0).getClosestDistance();
maxDist = maxDist = distanceContainerVector
distanceContainerVector.get(distanceContainerVector.size() - 1).getClosestDistance(); .get(distanceContainerVector.size() - 1)
.getClosestDistance();
} }
Vector3D translation = transform.getTranslation(); Vector3D translation = transform.getTranslation();
String tmpString = String tmpString =
String.format( String.format("%16.8e,%16.8e,%16.8e", translation.getX(), translation.getY(), translation.getZ());
"%16.8e,%16.8e,%16.8e", translation.getX(), translation.getY(), translation.getZ());
System.out.println("Translation: " + tmpString.replaceAll("\\s+", "")); System.out.println("Translation: " + tmpString.replaceAll("\\s+", ""));
Rotation rotation = transform.getRotation(); Rotation rotation = transform.getRotation();
Rotation inverse = Rotation inverse = rotation.composeInverse(Rotation.IDENTITY, RotationConvention.FRAME_TRANSFORM);
rotation.composeInverse(Rotation.IDENTITY, RotationConvention.FRAME_TRANSFORM); Quaternion q = new Quaternion(inverse.getQ0(), inverse.getQ1(), inverse.getQ2(), inverse.getQ3())
Quaternion q =
new Quaternion(inverse.getQ0(), inverse.getQ1(), inverse.getQ2(), inverse.getQ3())
.getPositivePolarForm(); .getPositivePolarForm();
tmpString = tmpString = String.format("%16.8e,%16.8e,%16.8e,%16.8e", q.getQ0(), q.getQ1(), q.getQ2(), q.getQ3());
String.format("%16.8e,%16.8e,%16.8e,%16.8e", q.getQ0(), q.getQ1(), q.getQ2(), q.getQ3());
System.out.println("Rotation quaternion: " + tmpString.replaceAll("\\s+", "")); System.out.println("Rotation quaternion: " + tmpString.replaceAll("\\s+", ""));
Vector3D axis = inverse.getAxis(RotationConvention.FRAME_TRANSFORM); Vector3D axis = inverse.getAxis(RotationConvention.FRAME_TRANSFORM);
tmpString = tmpString = String.format(
String.format(
"%16.8e,%16.8e,%16.8e,%16.8e", "%16.8e,%16.8e,%16.8e,%16.8e",
Math.toDegrees(inverse.getAngle()), axis.getX(), axis.getY(), axis.getZ()); Math.toDegrees(inverse.getAngle()), axis.getX(), axis.getY(), axis.getZ());
System.out.println("Rotation angle (degrees) and axis: " + tmpString.replaceAll("\\s+", "")); System.out.println("Rotation angle (degrees) and axis: " + tmpString.replaceAll("\\s+", ""));
Vector3D centerOfRotation = transform.getCenterOfRotation(); Vector3D centerOfRotation = transform.getCenterOfRotation();
tmpString = tmpString = String.format(
String.format( "%16.8e,%16.8e,%16.8e", centerOfRotation.getX(), centerOfRotation.getY(), centerOfRotation.getZ());
"%16.8e,%16.8e,%16.8e",
centerOfRotation.getX(), centerOfRotation.getY(), centerOfRotation.getZ());
System.out.println("Center of rotation: " + tmpString.replaceAll("\\s+", "")); System.out.println("Center of rotation: " + tmpString.replaceAll("\\s+", ""));
double[][] rotMatrix = inverse.getMatrix(); double[][] rotMatrix = inverse.getMatrix();
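The report above prints the inverse rotation as a quaternion in positive polar form. Since q and -q describe the same rotation, the positive polar form (non-negative scalar part) gives a single unambiguous representative. A small sketch with the same Apache Commons Math classes, using a placeholder 30-degree rotation:

    import org.apache.commons.math3.complex.Quaternion;
    import org.apache.commons.math3.geometry.euclidean.threed.Rotation;
    import org.apache.commons.math3.geometry.euclidean.threed.RotationConvention;
    import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;

    // Sketch: q and -q encode the same rotation; getPositivePolarForm() picks the
    // representative with a non-negative scalar part, matching the printout above.
    class QuaternionSketch {
      public static void main(String[] args) {
        Rotation r = new Rotation(Vector3D.PLUS_K, Math.toRadians(30), RotationConvention.FRAME_TRANSFORM);
        Quaternion q = new Quaternion(r.getQ0(), r.getQ1(), r.getQ2(), r.getQ3());
        Quaternion minusQ = new Quaternion(-r.getQ0(), -r.getQ1(), -r.getQ2(), -r.getQ3());
        System.out.println(q.getPositivePolarForm());
        System.out.println(minusQ.getPositivePolarForm()); // same values as the line above
      }
    }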
@@ -700,30 +683,25 @@ public class CompareOBJ implements TerrasaurTool {
Vector3D unitNormal = normal.normalize(); Vector3D unitNormal = normal.normalize();
org.apache.commons.math3.geometry.euclidean.threed.Plane p = org.apache.commons.math3.geometry.euclidean.threed.Plane p =
new org.apache.commons.math3.geometry.euclidean.threed.Plane( new org.apache.commons.math3.geometry.euclidean.threed.Plane(Vector3D.ZERO, unitNormal, 1e-6);
Vector3D.ZERO, unitNormal, 1e-6);
Vector3D parallelVector = p.toSpace(p.toSubSpace(translation)); Vector3D parallelVector = p.toSpace(p.toSubSpace(translation));
System.out.println( System.out.println("Direction perpendicular to plane: "
"Direction perpendicular to plane: "
+ unitNormal.getX() + unitNormal.getX()
+ " " + " "
+ unitNormal.getY() + unitNormal.getY()
+ " " + " "
+ unitNormal.getZ()); + unitNormal.getZ());
System.out.println( System.out.println(
"Magnitude of projection perpendicular to plane: " "Magnitude of projection perpendicular to plane: " + translation.dotProduct(unitNormal));
+ translation.dotProduct(unitNormal)); System.out.println("Projection vector of translation parallel to plane: "
System.out.println(
"Projection vector of translation parallel to plane: "
+ parallelVector.getX() + parallelVector.getX()
+ " " + " "
+ parallelVector.getY() + parallelVector.getY()
+ " " + " "
+ parallelVector.getZ()); + parallelVector.getZ());
System.out.println( System.out.println(
"Magnitude of projection vector of translation parallel to plane: " "Magnitude of projection vector of translation parallel to plane: " + parallelVector.getNorm());
+ parallelVector.getNorm());
/*- SPICE /*- SPICE
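The printout above splits the optimal translation into a component perpendicular to the fitted plane and a component parallel to it. A minimal sketch of that decomposition with placeholder vectors rather than CompareOBJ output; subtracting the normal component is equivalent to the Plane projection used in the code:

    import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;

    // Sketch of the decomposition reported above: split a translation into the part
    // perpendicular to a plane with unit normal n and the part parallel to it.
    class PlaneDecompositionSketch {
      public static void main(String[] args) {
        Vector3D n = new Vector3D(0, 0, 1);               // unit normal of the fitted plane (placeholder)
        Vector3D translation = new Vector3D(1.0, 2.0, 0.5); // placeholder translation
        double perpendicular = translation.dotProduct(n);  // signed magnitude along the normal
        Vector3D parallel = translation.subtract(n.scalarMultiply(perpendicular));
        System.out.println("perpendicular magnitude: " + perpendicular);
        System.out.println("parallel component: " + parallel + ", magnitude " + parallel.getNorm());
      }
    }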
@@ -752,8 +730,7 @@ public class CompareOBJ implements TerrasaurTool {
} }
System.out.println( System.out.println(numPlatesActuallyUsed
numPlatesActuallyUsed
+ " plates used in error calculation out of " + " plates used in error calculation out of "
+ polyDataModel.GetNumberOfCells() + polyDataModel.GetNumberOfCells()
+ " total in the shape model"); + " total in the shape model");
@@ -777,14 +754,13 @@ public class CompareOBJ implements TerrasaurTool {
// First do the intersection from "below" // First do the intersection from "below"
Vector3D startBottom = in.subtract(normal.scalarMultiply(size)); Vector3D startBottom = in.subtract(normal.scalarMultiply(size));
double[] out1 = new double[3]; double[] out1 = new double[3];
long cellId1 = long cellId1 = smallBodyModel.computeRayIntersection(startBottom.toArray(), normal.toArray(), out1);
smallBodyModel.computeRayIntersection(startBottom.toArray(), normal.toArray(), out1);
// Then do the intersection from on "top" // Then do the intersection from on "top"
Vector3D startTop = in.add(normal.scalarMultiply(size)); Vector3D startTop = in.add(normal.scalarMultiply(size));
double[] out2 = new double[3]; double[] out2 = new double[3];
long cellId2 = long cellId2 = smallBodyModel.computeRayIntersection(
smallBodyModel.computeRayIntersection(startTop.toArray(), normal.negate().toArray(), out2); startTop.toArray(), normal.negate().toArray(), out2);
if (cellId1 >= 0 && cellId2 >= 0) { if (cellId1 >= 0 && cellId2 >= 0) {
Vector3D out1V = new Vector3D(out1); Vector3D out1V = new Vector3D(out1);
@@ -793,8 +769,7 @@ public class CompareOBJ implements TerrasaurTool {
Vector3D inSubOut1 = in.subtract(out1V); Vector3D inSubOut1 = in.subtract(out1V);
Vector3D inSubOut2 = in.subtract(out2V); Vector3D inSubOut2 = in.subtract(out2V);
// If both intersected, take the closest // If both intersected, take the closest
if (inSubOut1.dotProduct(inSubOut1) < inSubOut2.dotProduct(inSubOut2)) if (inSubOut1.dotProduct(inSubOut1) < inSubOut2.dotProduct(inSubOut2)) return Optional.of(out1V);
return Optional.of(out1V);
else return Optional.of(out2V); else return Optional.of(out2V);
} }
if (cellId1 >= 0) return Optional.of(new Vector3D(out1)); if (cellId1 >= 0) return Optional.of(new Vector3D(out1));
@@ -806,19 +781,16 @@ public class CompareOBJ implements TerrasaurTool {
private static Options defineOptions() { private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions(); Options options = TerrasaurTool.defineOptions();
options.addOption( options.addOption(Option.builder("logFile")
Option.builder("logFile")
.hasArg() .hasArg()
.argName("path") .argName("path")
.desc("If present, save screen output to <path>.") .desc("If present, save screen output to <path>.")
.build()); .build());
StringBuilder sb = new StringBuilder(); StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name())); for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption( options.addOption(Option.builder("logLevel")
Option.builder("logLevel")
.hasArg() .hasArg()
.desc( .desc("If present, print messages above selected priority. Valid values are "
"If present, print messages above selected priority. Valid values are "
+ sb.toString().trim() + sb.toString().trim()
+ ". Default is INFO.") + ". Default is INFO.")
.build()); .build());
@@ -830,7 +802,8 @@ public class CompareOBJ implements TerrasaurTool {
sb.append("-model is transformed to -reference using this optimal "); sb.append("-model is transformed to -reference using this optimal ");
sb.append("rotation. This results in an error unbiased by a possible rotation between "); sb.append("rotation. This results in an error unbiased by a possible rotation between ");
sb.append("the two models."); sb.append("the two models.");
options.addOption(Option.builder("computeOptimalRotation").desc(sb.toString()).build()); options.addOption(
Option.builder("computeOptimalRotation").desc(sb.toString()).build());
sb = new StringBuilder(); sb = new StringBuilder();
sb.append("If specified, the program first computes the optimal translation of "); sb.append("If specified, the program first computes the optimal translation of ");
@@ -839,7 +812,8 @@ public class CompareOBJ implements TerrasaurTool {
sb.append("is transformed to the -reference using this optimal translation. "); sb.append("is transformed to the -reference using this optimal translation. ");
sb.append("This results in an error unbiased by a possible translation offset between "); sb.append("This results in an error unbiased by a possible translation offset between ");
sb.append("the two models."); sb.append("the two models.");
options.addOption(Option.builder("computeOptimalTranslation").desc(sb.toString()).build()); options.addOption(
Option.builder("computeOptimalTranslation").desc(sb.toString()).build());
sb = new StringBuilder(); sb = new StringBuilder();
sb.append("If specified, the program first computes the optimal translation and "); sb.append("If specified, the program first computes the optimal translation and ");
@@ -848,16 +822,16 @@ public class CompareOBJ implements TerrasaurTool {
sb.append("-model is transformed to -reference using this optimal "); sb.append("-model is transformed to -reference using this optimal ");
sb.append("translation and rotation. This results in an error unbiased by a possible "); sb.append("translation and rotation. This results in an error unbiased by a possible ");
sb.append("translation offset or rotation between the two models."); sb.append("translation offset or rotation between the two models.");
options.addOption( options.addOption(Option.builder("computeOptimalRotationAndTranslation")
Option.builder("computeOptimalRotationAndTranslation").desc(sb.toString()).build()); .desc(sb.toString())
.build());
sb = new StringBuilder(); sb = new StringBuilder();
sb.append("If specified, the program computes the error in the vertical direction "); sb.append("If specified, the program computes the error in the vertical direction ");
sb.append("(by fitting a plane to -model) and saves it to <path>. This option "); sb.append("(by fitting a plane to -model) and saves it to <path>. This option ");
sb.append("only produces meaningful results for digital terrain models to which a plane "); sb.append("only produces meaningful results for digital terrain models to which a plane ");
sb.append("can be fit."); sb.append("can be fit.");
options.addOption( options.addOption(Option.builder("computeVerticalError")
Option.builder("computeVerticalError")
.hasArg() .hasArg()
.argName("path") .argName("path")
.desc(sb.toString()) .desc(sb.toString())
@@ -869,42 +843,47 @@ public class CompareOBJ implements TerrasaurTool {
sb.append("present, only points within a distance of radius will be used to construct the "); sb.append("present, only points within a distance of radius will be used to construct the ");
sb.append("plane. Recommended value is ~5% of the body radius for a global model. Units "); sb.append("plane. Recommended value is ~5% of the body radius for a global model. Units ");
sb.append("are units of the shape model."); sb.append("are units of the shape model.");
options.addOption(Option.builder("fitPlaneRadius").hasArg().desc(sb.toString()).build()); options.addOption(
Option.builder("fitPlaneRadius").hasArg().desc(sb.toString()).build());
sb = new StringBuilder(); sb = new StringBuilder();
sb.append("Limit the distances (described in -savePlateDiff) used in calculating the mean "); sb.append("Limit the distances (described in -savePlateDiff) used in calculating the mean ");
sb.append("distance and RMS distance to the closest fraction of all distances."); sb.append("distance and RMS distance to the closest fraction of all distances.");
options.addOption(Option.builder("limitClosestPoints").hasArg().desc(sb.toString()).build()); options.addOption(Option.builder("limitClosestPoints")
.hasArg()
.desc(sb.toString())
.build());
sb = new StringBuilder(); sb = new StringBuilder();
sb.append("Max number of control points to use in optimization. Default is 2000."); sb.append("Max number of control points to use in optimization. Default is 2000.");
options.addOption( options.addOption(Option.builder("maxNumberControlPoints")
Option.builder("maxNumberControlPoints").hasArg().desc(sb.toString()).build()); .hasArg()
.desc(sb.toString())
.build());
options.addOption( options.addOption(Option.builder("model")
Option.builder("model")
.required() .required()
.hasArg() .hasArg()
.argName("path") .argName("path")
.desc( .desc("Required. Point cloud/shape file to compare to reference shape. Valid formats are "
"Required. Point cloud/shape file to compare to reference shape. Valid formats are "
+ "anything that can be read by the PointCloudFormatConverter.") + "anything that can be read by the PointCloudFormatConverter.")
.build()); .build());
options.addOption( options.addOption(Option.builder("reference")
Option.builder("reference")
.required() .required()
.hasArg() .hasArg()
.argName("path") .argName("path")
.desc( .desc("Required. Reference shape file. Valid formats are FITS, ICQ, OBJ, PLT, PLY, or VTK.")
"Required. Reference shape file. Valid formats are FITS, ICQ, OBJ, PLT, PLY, or VTK.")
.build()); .build());
sb = new StringBuilder(); sb = new StringBuilder();
sb.append("Save the rotated and/or translated -model to <path>. "); sb.append("Save the rotated and/or translated -model to <path>. ");
sb.append("This option requires one of -computeOptimalRotation, -computeOptimalTranslation "); sb.append("This option requires one of -computeOptimalRotation, -computeOptimalTranslation ");
sb.append("or -computeOptimalRotationAndTranslation to be specified."); sb.append("or -computeOptimalRotationAndTranslation to be specified.");
options.addOption( options.addOption(Option.builder("saveOptimalShape")
Option.builder("saveOptimalShape").hasArg().argName("path").desc(sb.toString()).build()); .hasArg()
.argName("path")
.desc(sb.toString())
.build());
sb = new StringBuilder(); sb = new StringBuilder();
sb.append("Save to a file specified by <path> the distances (in same units as the shape "); sb.append("Save to a file specified by <path> the distances (in same units as the shape ");
@@ -912,8 +891,11 @@ public class CompareOBJ implements TerrasaurTool {
sb.append("-reference. The number of lines in the file equals the number of plates in "); sb.append("-reference. The number of lines in the file equals the number of plates in ");
sb.append("the first shape model with each line containing the distance of that plate "); sb.append("the first shape model with each line containing the distance of that plate ");
sb.append("to the closest point in the second shape model."); sb.append("to the closest point in the second shape model.");
options.addOption( options.addOption(Option.builder("savePlateDiff")
Option.builder("savePlateDiff").hasArg().argName("path").desc(sb.toString()).build()); .hasArg()
.argName("path")
.desc(sb.toString())
.build());
sb = new StringBuilder(); sb = new StringBuilder();
sb.append("Save to a file specified by <path> the index of the closest plate in "); sb.append("Save to a file specified by <path> the index of the closest plate in ");
@@ -921,13 +903,15 @@ public class CompareOBJ implements TerrasaurTool {
sb.append("The format of each line is "); sb.append("The format of each line is ");
sb.append("\n\tplate index, closest reference plate index, distance\n"); sb.append("\n\tplate index, closest reference plate index, distance\n");
sb.append("Only valid when -model is a shape model with facet information."); sb.append("Only valid when -model is a shape model with facet information.");
options.addOption( options.addOption(Option.builder("savePlateIndex")
Option.builder("savePlateIndex").hasArg().argName("path").desc(sb.toString()).build()); .hasArg()
.argName("path")
.desc(sb.toString())
.build());
sb = new StringBuilder(); sb = new StringBuilder();
sb.append("Save output of lidar-optimize to <path> (JSON format file)"); sb.append("Save output of lidar-optimize to <path> (JSON format file)");
options.addOption( options.addOption(Option.builder("saveTransformationFile")
Option.builder("saveTransformationFile")
.hasArg() .hasArg()
.argName("path") .argName("path")
.desc(sb.toString()) .desc(sb.toString())
@@ -936,8 +920,11 @@ public class CompareOBJ implements TerrasaurTool {
sb = new StringBuilder(); sb = new StringBuilder();
sb.append("Directory to store temporary files. It will be created if it does not exist. "); sb.append("Directory to store temporary files. It will be created if it does not exist. ");
sb.append("Default is the current working directory."); sb.append("Default is the current working directory.");
options.addOption( options.addOption(Option.builder("tmpDir")
Option.builder("tmpDir").hasArg().argName("path").desc(sb.toString()).build()); .hasArg()
.argName("path")
.desc(sb.toString())
.build());
sb = new StringBuilder(); sb = new StringBuilder();
sb.append("Use all points in -model when attempting to fit to "); sb.append("Use all points in -model when attempting to fit to ");
@@ -960,34 +947,25 @@ public class CompareOBJ implements TerrasaurTool {
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml))); logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
double limitClosestPoints = double limitClosestPoints =
cl.hasOption("limitClosestPoints") cl.hasOption("limitClosestPoints") ? Double.parseDouble(cl.getOptionValue("limitClosestPoints")) : 1.0;
? Double.parseDouble(cl.getOptionValue("limitClosestPoints"))
: 1.0;
limitClosestPoints = Math.max(0, Math.min(1, limitClosestPoints)); limitClosestPoints = Math.max(0, Math.min(1, limitClosestPoints));
boolean saveOptimalShape = cl.hasOption("saveOptimalShape"); boolean saveOptimalShape = cl.hasOption("saveOptimalShape");
boolean computeOptimalTranslation = boolean computeOptimalTranslation =
cl.hasOption("computeOptimalRotationAndTranslation") cl.hasOption("computeOptimalRotationAndTranslation") || cl.hasOption("computeOptimalTranslation");
|| cl.hasOption("computeOptimalTranslation");
boolean computeOptimalRotation = boolean computeOptimalRotation =
cl.hasOption("computeOptimalRotationAndTranslation") cl.hasOption("computeOptimalRotationAndTranslation") || cl.hasOption("computeOptimalRotation");
|| cl.hasOption("computeOptimalRotation");
final boolean computeHorizontalError = false; final boolean computeHorizontalError = false;
boolean useOverlappingPoints = !cl.hasOption("useAllPoints"); boolean useOverlappingPoints = !cl.hasOption("useAllPoints");
double planeRadius = double planeRadius =
cl.hasOption("fitPlaneRadius") cl.hasOption("fitPlaneRadius") ? Double.parseDouble(cl.getOptionValue("fitPlaneRadius")) : 0;
? Double.parseDouble(cl.getOptionValue("fitPlaneRadius")) int maxNumberOfControlPoints = cl.hasOption("maxNumberControlPoints")
: 0;
int maxNumberOfControlPoints =
cl.hasOption("maxNumberControlPoints")
? Integer.parseInt(cl.getOptionValue("maxNumberControlPoints")) ? Integer.parseInt(cl.getOptionValue("maxNumberControlPoints"))
: 2000; : 2000;
int npoints = -1; int npoints = -1;
double size = 0; double size = 0;
String closestDiffFile = String closestDiffFile = cl.hasOption("savePlateDiff") ? cl.getOptionValue("savePlateDiff") : null;
cl.hasOption("savePlateDiff") ? cl.getOptionValue("savePlateDiff") : null; String closestIndexFile = cl.hasOption("savePlateIndex") ? cl.getOptionValue("savePlateIndex") : null;
String closestIndexFile =
cl.hasOption("savePlateIndex") ? cl.getOptionValue("savePlateIndex") : null;
String optimalShapeFile = saveOptimalShape ? cl.getOptionValue("saveOptimalShape") : null; String optimalShapeFile = saveOptimalShape ? cl.getOptionValue("saveOptimalShape") : null;
String verticalDiffFile = String verticalDiffFile =
cl.hasOption("computeVerticalError") ? cl.getOptionValue("computeVerticalError") : null; cl.hasOption("computeVerticalError") ? cl.getOptionValue("computeVerticalError") : null;

View File
@@ -60,8 +60,7 @@ public class CreateSBMTStructure implements TerrasaurTool {
@Override @Override
public String fullDescription(Options options) { public String fullDescription(Options options) {
String header = String header = "This tool creates an SBMT ellipse file from a set of points on an image.";
"This tool creates an SBMT ellipse file from a set of point on an image.";
String footer = ""; String footer = "";
return TerrasaurTool.super.fullDescription(options, header, footer); return TerrasaurTool.super.fullDescription(options, header, footer);
} }
@@ -75,8 +74,7 @@ public class CreateSBMTStructure implements TerrasaurTool {
* @param p3 Third point * @param p3 Third point
* @return An SBMT structure describing the ellipse * @return An SBMT structure describing the ellipse
*/ */
private static SBMTEllipseRecord createRecord( private static SBMTEllipseRecord createRecord(int id, String name, Vector3D p1, Vector3D p2, Vector3D p3) {
int id, String name, Vector3D p1, Vector3D p2, Vector3D p3) {
// Create a local coordinate system where X axis contains long axis and Y axis contains short // Create a local coordinate system where X axis contains long axis and Y axis contains short
// axis // axis
@@ -104,8 +102,7 @@ public class CreateSBMTStructure implements TerrasaurTool {
double rotation = Math.atan2(b.getY() - a.getY(), b.getX() - a.getX()); double rotation = Math.atan2(b.getY() - a.getY(), b.getX() - a.getX());
double flattening = (majorAxis - minorAxis) / majorAxis; double flattening = (majorAxis - minorAxis) / majorAxis;
ImmutableSBMTEllipseRecord.Builder record = ImmutableSBMTEllipseRecord.Builder record = ImmutableSBMTEllipseRecord.builder()
ImmutableSBMTEllipseRecord.builder()
.id(id) .id(id)
.name(name) .name(name)
.x(origin.getX()) .x(origin.getX())
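The record built above stores the long-axis rotation and a flattening of (major - minor) / major. A small worked sketch of those two quantities with placeholder axis lengths and endpoint coordinates in the local X-Y plane:

    // Sketch only; a/b and the endpoints are illustrative numbers, not tool output.
    class EllipseParamSketch {
      public static void main(String[] args) {
        double majorAxis = 120.0, minorAxis = 90.0;
        double ax = -60.0, ay = 0.0, bx = 60.0, by = 0.0; // long-axis endpoints

        double rotation = Math.atan2(by - ay, bx - ax);          // radians from the local X axis
        double flattening = (majorAxis - minorAxis) / majorAxis; // 0 for a circle, approaches 1 as it thins

        System.out.printf("rotation %f rad, flattening %f%n", rotation, flattening);
      }
    }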
@@ -129,43 +126,70 @@ public class CreateSBMTStructure implements TerrasaurTool {
private static Options defineOptions() { private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions(); Options options = TerrasaurTool.defineOptions();
options.addOption( options.addOption(Option.builder("flipX")
Option.builder("input") .desc("If present, negate the X coordinate of the input points.")
.build());
options.addOption(Option.builder("flipY")
.desc("If present, negate the Y coordinate of the input points.")
.build());
options.addOption(Option.builder("input")
.required() .required()
.hasArg() .hasArg()
.desc( .desc(
""" """
Required. Name of input file. This is a text file with a pair of pixel coordinates per line. The pixel Required. Name of input file. This is a text file with a pair of pixel coordinates (X and Y) per line. The pixel
coordinates are offsets from the image center. For example: coordinates are offsets from the image center. For example:
# My test file # My test file
627.51274 876.11775 89.6628 285.01
630.53612 883.55992 97.8027 280.126
626.3499 881.46681 95.0119 285.01
-13.8299 323.616
-1.9689 331.756
-11.7367 330.826
Empty lines or lines beginning with # are ignored. Empty lines or lines beginning with # are ignored.
Each set of three points is used to create the SBMT structures. The first two points are the long Each set of three points is used to create the SBMT structures. The first two points are the long
axis and the third is a location for the semi-minor axis.""") axis and the third is a location for the semi-minor axis.""")
.build()); .build());
options.addOption( options.addOption(Option.builder("objFile")
Option.builder("objFile")
.required() .required()
.hasArg() .hasArg()
.desc("Required. Name of OBJ shape file.") .desc("Required. Name of OBJ shape file.")
.build()); .build());
options.addOption( options.addOption(Option.builder("output")
Option.builder("output")
.required() .required()
.hasArg() .hasArg()
.desc("Required. Name of output file.") .desc("Required. Name of output file.")
.build()); .build());
options.addOption( options.addOption(Option.builder("spice")
Option.builder("sumFile") .hasArg()
.desc(
"If present, name of metakernel to read. Other required options with -spice are -date, -observer, -target, and -cameraFrame.")
.build());
options.addOption(Option.builder("date")
.hasArgs()
.desc("Only used with -spice. Date of image (e.g. 2022 SEP 26 23:11:12.649).")
.build());
options.addOption(Option.builder("observer")
.hasArg()
.desc("Only used with -spice. Observing body (e.g. DART)")
.build());
options.addOption(Option.builder("target")
.hasArg()
.desc("Only used with -spice. Target body (e.g. DIMORPHOS).")
.build());
options.addOption(Option.builder("cameraFrame")
.hasArg()
.desc("Only used with -spice. Camera frame (e.g. DART_DRACO).")
.build());
options.addOption(Option.builder("sumFile")
.required() .required()
.hasArg() .hasArg()
.desc("Required. Name of sum file to read.") .desc(
"Required. Name of sum file to read. This is still required with -spice, but only used as a template to create a new sum file.")
.build()); .build());
return options; return options;
} }
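Per the -input description, points are consumed three at a time: two long-axis endpoints followed by a semi-minor axis location. A minimal sketch of that grouping with placeholder intercepts; the actual record construction happens in createRecord above.

    import java.util.List;
    import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;

    // Sketch only: group intercept points into triples, matching the -input description.
    class TripleGroupingSketch {
      public static void main(String[] args) {
        List<Vector3D> intercepts = List.of(
            new Vector3D(1, 0, 0), new Vector3D(-1, 0, 0), new Vector3D(0, 0.5, 0)); // placeholders

        for (int i = 0; i + 2 < intercepts.size(); i += 3) {
          Vector3D p1 = intercepts.get(i);     // long-axis endpoint
          Vector3D p2 = intercepts.get(i + 1); // long-axis endpoint
          Vector3D p3 = intercepts.get(i + 2); // semi-minor axis location
          System.out.printf("structure %d: %s %s %s%n", i / 3, p1, p2, p3);
        }
      }
    }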
@@ -178,8 +202,7 @@ axis and the third is a location for the semi-minor axis.""")
CommandLine cl = defaultOBJ.parseArgs(args, options); CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl); Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet()) for (MessageLabel ml : startupMessages.keySet()) logger.info("{} {}", ml.label, startupMessages.get(ml));
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
NativeLibraryLoader.loadSpiceLibraries(); NativeLibraryLoader.loadSpiceLibraries();
NativeLibraryLoader.loadVtkLibraries(); NativeLibraryLoader.loadVtkLibraries();
@@ -195,15 +218,20 @@ axis and the third is a location for the semi-minor axis.""")
} }
RangeFromSumFile rfsf = new RangeFromSumFile(sumFile, polyData); RangeFromSumFile rfsf = new RangeFromSumFile(sumFile, polyData);
boolean flipX = cl.hasOption("flipX");
boolean flipY = cl.hasOption("flipY");
List<Vector3D> intercepts = new ArrayList<>(); List<Vector3D> intercepts = new ArrayList<>();
List<String> lines = List<String> lines = FileUtils.readLines(new File(cl.getOptionValue("input")), Charset.defaultCharset());
FileUtils.readLines(new File(cl.getOptionValue("input")), Charset.defaultCharset()); for (String line : lines.stream()
for (String line : .filter(s -> !(s.isBlank() || s.strip().startsWith("#")))
lines.stream().filter(s -> !(s.isBlank() || s.strip().startsWith("#"))).toList()) { .toList()) {
String[] parts = line.split("\\s+"); String[] parts = line.split("\\s+");
int ix = (int) Math.round(Double.parseDouble(parts[0])); int ix = (int) Math.round(Double.parseDouble(parts[0]));
int iy = (int) Math.round(Double.parseDouble(parts[1])); int iy = (int) Math.round(Double.parseDouble(parts[1]));
if (flipX) ix *= -1;
if (flipY) iy *= -1;
Map.Entry<Long, Vector3D> entry = rfsf.findIntercept(ix, iy); Map.Entry<Long, Vector3D> entry = rfsf.findIntercept(ix, iy);
long cellID = entry.getKey(); long cellID = entry.getKey();
if (cellID > -1) intercepts.add(entry.getValue()); if (cellID > -1) intercepts.add(entry.getValue());
View File
@@ -59,39 +59,35 @@ public class DSK2OBJ implements TerrasaurTool {
private static Options defineOptions() { private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions(); Options options = TerrasaurTool.defineOptions();
options.addOption( options.addOption(Option.builder("body")
Option.builder("body")
.hasArg() .hasArg()
.desc( .desc("If present, convert shape for named body. Default is to use the first body in the DSK.")
"If present, convert shape for named body. Default is to use the first body in the DSK.")
.build()); .build());
options.addOption( options.addOption(Option.builder("dsk")
Option.builder("dsk").hasArg().required().desc("Required. Name of input DSK.").build()); .hasArg()
options.addOption( .required()
Option.builder("logFile") .desc("Required. Name of input DSK.")
.build());
options.addOption(Option.builder("logFile")
.hasArg() .hasArg()
.desc("If present, save screen output to log file.") .desc("If present, save screen output to log file.")
.build()); .build());
StringBuilder sb = new StringBuilder(); StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name())); for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption( options.addOption(Option.builder("logLevel")
Option.builder("logLevel")
.hasArg() .hasArg()
.desc( .desc("If present, print messages above selected priority. Valid values are "
"If present, print messages above selected priority. Valid values are "
+ sb.toString().trim() + sb.toString().trim()
+ ". Default is INFO.") + ". Default is INFO.")
.build()); .build());
options.addOption(Option.builder("obj").hasArg().desc("Name of output OBJ.").build());
options.addOption( options.addOption(
Option.builder("printBodies") Option.builder("obj").hasArg().desc("Name of output OBJ.").build());
options.addOption(Option.builder("printBodies")
.desc("If present, print bodies and surface ids in DSK.") .desc("If present, print bodies and surface ids in DSK.")
.build()); .build());
options.addOption( options.addOption(Option.builder("surface")
Option.builder("surface")
.hasArg() .hasArg()
.desc( .desc("If present, use specified surface id. Default is to use the first surface id for the body.")
"If present, use specified surface id. Default is to use the first surface id for the body.")
.build()); .build());
return options; return options;
} }
@@ -147,8 +143,7 @@ public class DSK2OBJ implements TerrasaurTool {
} }
Surface[] surfaces = dsk.getSurfaces(b); Surface[] surfaces = dsk.getSurfaces(b);
Surface s = Surface s = cl.hasOption("surface")
cl.hasOption("surface")
? new Surface(Integer.parseInt(cl.getOptionValue("surface")), b) ? new Surface(Integer.parseInt(cl.getOptionValue("surface")), b)
: surfaces[0]; : surfaces[0];
boolean missingSurface = true; boolean missingSurface = true;
@@ -159,10 +154,8 @@ public class DSK2OBJ implements TerrasaurTool {
} }
} }
if (missingSurface) { if (missingSurface) {
logger.warn( logger.warn(String.format(
String.format( "Surface %d for body %s not found in DSK! Valid surfaces are:", s.getIDCode(), b.getName()));
"Surface %d for body %s not found in DSK! Valid surfaces are:",
s.getIDCode(), b.getName()));
for (Surface surface : surfaces) logger.warn(Integer.toString(surface.getIDCode())); for (Surface surface : surfaces) logger.warn(Integer.toString(surface.getIDCode()));
System.exit(0); System.exit(0);
} }
@@ -191,8 +184,7 @@ public class DSK2OBJ implements TerrasaurTool {
for (int[] p : plates) { for (int[] p : plates) {
pw.printf("f %d %d %d\r\n", p[0], p[1], p[2]); pw.printf("f %d %d %d\r\n", p[0], p[1], p[2]);
} }
logger.info( logger.info(String.format(
String.format(
"Wrote %d vertices and %d plates to %s for body %d surface %d", "Wrote %d vertices and %d plates to %s for body %d surface %d",
nv[0], nv[0],
np[0], np[0],
View File
@@ -89,7 +89,6 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
private DifferentialVolumeEstimator() {} private DifferentialVolumeEstimator() {}
@Override @Override
public String shortDescription() { public String shortDescription() {
return "Find volume difference between two shape models."; return "Find volume difference between two shape models.";
@@ -102,15 +101,13 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
public String fullDescription(Options options) { public String fullDescription(Options options) {
String header = ""; String header = "";
String footer = "\nThis program finds the volume difference between a shape model and a reference surface. " String footer = "\nThis program finds the volume difference between a shape model and a reference surface. "
+"The reference surface can either be another shape model or a degree "+POLYNOMIAL_DEGREE+" fit to a set of supplied points. "+ + "The reference surface can either be another shape model or a degree " + POLYNOMIAL_DEGREE
"A local coordinate system is derived from the reference surface. The heights of the shape and reference at "+ + " fit to a set of supplied points. "
"each grid point are reported. "; + "A local coordinate system is derived from the reference surface. The heights of the shape and reference at "
+ "each grid point are reported. ";
return TerrasaurTool.super.fullDescription(options, header, footer); return TerrasaurTool.super.fullDescription(options, header, footer);
} }
/** input shape model */ /** input shape model */
private vtkPolyData globalPolyData; private vtkPolyData globalPolyData;
/** shape model in native coordinates */ /** shape model in native coordinates */
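The description above reports the shape and reference heights at every grid point. One way to turn those heights into a volume difference is a Riemann sum of (model height - reference height) times the grid cell area; the sketch below illustrates that idea with placeholder values and is not the tool's own integration code.

    // Sketch: accumulate a volume difference from per-grid-point heights.
    class VolumeDifferenceSketch {
      public static void main(String[] args) {
        double gridSpacing = 0.5;                        // placeholder, same units as the model
        double[] modelHeight = {1.2, 1.0, 0.8};          // placeholder heights
        double[] referenceHeight = {1.0, 1.0, 1.0};

        double cellArea = gridSpacing * gridSpacing;
        double volumeDifference = 0;
        for (int i = 0; i < modelHeight.length; i++) {
          double dh = modelHeight[i] - referenceHeight[i];
          if (Double.isFinite(dh)) volumeDifference += dh * cellArea;
        }
        System.out.println("volume difference: " + volumeDifference);
      }
    }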
@@ -169,7 +166,10 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
// the origin of the local coordinate system, in global coordinates // the origin of the local coordinate system, in global coordinates
private enum ORIGIN { private enum ORIGIN {
MIN_HEIGHT, MAX_HEIGHT, CUSTOM, DEFAULT MIN_HEIGHT,
MAX_HEIGHT,
CUSTOM,
DEFAULT
} }
public Vector3D nativeToLocal(Vector3D nativeIJK) { public Vector3D nativeToLocal(Vector3D nativeIJK) {
@@ -211,8 +211,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
Path2D.Double outline = new Path2D.Double(); Path2D.Double outline = new Path2D.Double();
for (int i = 0; i < points.size(); i++) { for (int i = 0; i < points.size(); i++) {
Vector3D nativeIJK = plane.globalToLocal(points.get(i)); Vector3D nativeIJK = plane.globalToLocal(points.get(i));
Vector3D localIJK = Vector3D localIJK = nativeToLocal.getKey().applyTo(nativeIJK.subtract(nativeToLocal.getValue()));
nativeToLocal.getKey().applyTo(nativeIJK.subtract(nativeToLocal.getValue()));
if (i == 0) { if (i == 0) {
outline.moveTo(localIJK.getX(), localIJK.getY()); outline.moveTo(localIJK.getX(), localIJK.getY());
} else { } else {
@@ -251,11 +250,9 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
direction[2] = -1; direction[2] = -1;
cellID = nativeSBM.computeRayIntersection(origin, direction, intersect); cellID = nativeSBM.computeRayIntersection(origin, direction, intersect);
} }
if (cellID >= 0) if (cellID >= 0) height = direction[2] * new Vector3D(origin).distance(new Vector3D(intersect));
height = direction[2] * new Vector3D(origin).distance(new Vector3D(intersect));
if (Double.isNaN(height)) if (Double.isNaN(height)) return Double.NaN;
return Double.NaN;
return height; return height;
} }
@@ -289,8 +286,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
refHeight = referenceSurface.value(x, y); refHeight = referenceSurface.value(x, y);
} }
if (Double.isNaN(refHeight)) if (Double.isNaN(refHeight)) return Double.NaN;
return Double.NaN;
return refHeight; return refHeight;
} }
@@ -308,12 +304,9 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
for (double x = 0; x <= gridHalfExtent; x += gridSpacing) { for (double x = 0; x <= gridHalfExtent; x += gridSpacing) {
for (double y = 0; y <= gridHalfExtent; y += gridSpacing) { for (double y = 0; y <= gridHalfExtent; y += gridSpacing) {
localGrid.add(new Vector3D(x, y, 0)); localGrid.add(new Vector3D(x, y, 0));
if (y != 0) if (y != 0) localGrid.add(new Vector3D(x, -y, 0));
localGrid.add(new Vector3D(x, -y, 0)); if (x != 0) localGrid.add(new Vector3D(-x, y, 0));
if (x != 0) if (x != 0 && y != 0) localGrid.add(new Vector3D(-x, -y, 0));
localGrid.add(new Vector3D(-x, y, 0));
if (x != 0 && y != 0)
localGrid.add(new Vector3D(-x, -y, 0));
} }
} }
this.gridSpacing = gridSpacing; this.gridSpacing = gridSpacing;
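The mirrored loop above fills one quadrant and reflects it into the other three, so the grid contains (2N+1) x (2N+1) points with N = floor(gridHalfExtent / gridSpacing), assuming the spacing divides the half extent evenly. A quick check of that count with placeholder values:

    // Sketch: expected number of grid points produced by the mirrored loop above.
    class GridCountSketch {
      public static void main(String[] args) {
        double gridHalfExtent = 2.0, gridSpacing = 0.5;  // placeholder values
        int n = (int) Math.floor(gridHalfExtent / gridSpacing);
        int pointsPerSide = 2 * n + 1;
        System.out.println(pointsPerSide * pointsPerSide); // 81 for these values
      }
    }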
@@ -326,17 +319,13 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
GridPoint gp = new GridPoint(localPoint); GridPoint gp = new GridPoint(localPoint);
gridPoints.add(gp); gridPoints.add(gp);
if (Double.isFinite(gp.height)) { if (Double.isFinite(gp.height)) {
if (highGridPoint == null || highGridPoint.height < gp.height) if (highGridPoint == null || highGridPoint.height < gp.height) highGridPoint = gp;
highGridPoint = gp; if (lowGridPoint == null || lowGridPoint.height > gp.height) lowGridPoint = gp;
if (lowGridPoint == null || lowGridPoint.height > gp.height)
lowGridPoint = gp;
} }
} }
if (highGridPoint != null) if (highGridPoint != null) highPoint = highGridPoint.globalIJK;
highPoint = highGridPoint.globalIJK; if (lowGridPoint != null) lowPoint = lowGridPoint.globalIJK;
if (lowGridPoint != null)
lowPoint = lowGridPoint.globalIJK;
return gridPoints; return gridPoints;
} }
@@ -374,8 +363,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
// flip the plane // flip the plane
Pair<Rotation, Vector3D> transform = plane.getTransform(); Pair<Rotation, Vector3D> transform = plane.getTransform();
Vector3D planeNormal = transform.getKey().applyInverseTo(referenceNormal); Vector3D planeNormal = transform.getKey().applyInverseTo(referenceNormal);
if (planeNormal.dotProduct(referenceNormal) < 0) if (planeNormal.dotProduct(referenceNormal) < 0) plane = plane.reverseNormal();
plane = plane.reverseNormal();
// create the SmallBodyModel for the shape to evaluate // create the SmallBodyModel for the shape to evaluate
vtkPolyData nativePolyData = new vtkPolyData(); vtkPolyData nativePolyData = new vtkPolyData();
@@ -431,13 +419,19 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
Vector3D translateNativeToLocalInGlobalCoordinates = Vector3D translateNativeToLocalInGlobalCoordinates =
localOriginInGlobalCoordinates.subtract(nativeOriginInGlobalCoordinates); localOriginInGlobalCoordinates.subtract(nativeOriginInGlobalCoordinates);
// TODO: check that the Z component is zero (it should be?) // TODO: check that the Z component is zero (it should be?)
translateNativeToLocal = translateNativeToLocal = globalToLocalTransform.getKey().applyTo(translateNativeToLocalInGlobalCoordinates);
globalToLocalTransform.getKey().applyTo(translateNativeToLocalInGlobalCoordinates);
} }
Rotation rotateNativeToLocal = Rotation rotateNativeToLocal = MathConversions.toRotation(new RotationMatrixIJK(
MathConversions.toRotation(new RotationMatrixIJK(iRow.getX(), jRow.getX(), kRow.getX(), iRow.getX(),
iRow.getY(), jRow.getY(), kRow.getY(), iRow.getZ(), jRow.getZ(), kRow.getZ())); jRow.getX(),
kRow.getX(),
iRow.getY(),
jRow.getY(),
kRow.getY(),
iRow.getZ(),
jRow.getZ(),
kRow.getZ()));
this.nativeToLocal = new AbstractMap.SimpleEntry<>(rotateNativeToLocal, translateNativeToLocal); this.nativeToLocal = new AbstractMap.SimpleEntry<>(rotateNativeToLocal, translateNativeToLocal);
} }
@@ -454,8 +448,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
sb.append("# Local X and Y are grid coordinates in the local reference frame\n"); sb.append("# Local X and Y are grid coordinates in the local reference frame\n");
sb.append("# Angle is measured from the local X axis, in degrees\n"); sb.append("# Angle is measured from the local X axis, in degrees\n");
sb.append("# ROI flag is 1 if point is in the region of interest, 0 if not\n"); sb.append("# ROI flag is 1 if point is in the region of interest, 0 if not\n");
sb.append("# Global X, Y, and Z are the local grid points in the global " sb.append("# Global X, Y, and Z are the local grid points in the global " + " (input) reference system\n");
+ " (input) reference system\n");
sb.append("# Reference Height is the height of the reference model (or fit surface) above " sb.append("# Reference Height is the height of the reference model (or fit surface) above "
+ "the local grid plane\n"); + "the local grid plane\n");
sb.append("# Model Height is the height of the shape model above the local grid plane. " sb.append("# Model Height is the height of the shape model above the local grid plane. "
@@ -476,7 +469,6 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
return sb.toString(); return sb.toString();
} }
/** /**
* The header for the sector CSV file. Each line begins with a # * The header for the sector CSV file. Each line begins with a #
* *
@@ -507,8 +499,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
sb.append(String.format("%f, ", gp.localIJK.getY())); sb.append(String.format("%f, ", gp.localIJK.getY()));
double angle = Math.toDegrees(Math.atan2(gp.localIJK.getY(), gp.localIJK.getX())); double angle = Math.toDegrees(Math.atan2(gp.localIJK.getY(), gp.localIJK.getX()));
if (angle < 0) if (angle < 0) angle += 360;
angle += 360;
sb.append(String.format("%f, ", angle)); sb.append(String.format("%f, ", angle));
sb.append(String.format("%d, ", isInsideROI(gp.localIJK) ? 1 : 0)); sb.append(String.format("%d, ", isInsideROI(gp.localIJK) ? 1 : 0));
@@ -535,8 +526,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
Point2D thisPoint = new Point2D.Double(localIJ.getX(), localIJ.getY()); Point2D thisPoint = new Point2D.Double(localIJ.getX(), localIJ.getY());
boolean insideROI = roiOuter == null || roiOuter.contains(thisPoint); boolean insideROI = roiOuter == null || roiOuter.contains(thisPoint);
if (roiInner != null) { if (roiInner != null) {
if (insideROI && roiInner.contains(thisPoint)) if (insideROI && roiInner.contains(thisPoint)) insideROI = false;
insideROI = false;
} }
return insideROI; return insideROI;
} }
@@ -549,9 +539,11 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
* @param sectorsMap grid points within each sector * @param sectorsMap grid points within each sector
* @param vtkFile file to write * @param vtkFile file to write
*/ */
private void writeReferenceVTK(Collection<GridPoint> gridPointsList, private void writeReferenceVTK(
Collection<GridPoint> gridPointsList,
Map<Integer, Collection<GridPoint>> profilesMap, Map<Integer, Collection<GridPoint>> profilesMap,
Map<Integer, Collection<GridPoint>> sectorsMap, String vtkFile) { Map<Integer, Collection<GridPoint>> sectorsMap,
String vtkFile) {
Map<Vector3D, Boolean> roiMap = new HashMap<>(); Map<Vector3D, Boolean> roiMap = new HashMap<>();
Map<Vector3D, Integer> profileMap = new HashMap<>(); Map<Vector3D, Integer> profileMap = new HashMap<>();
@@ -559,8 +551,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
for (GridPoint gp : gridPointsList) { for (GridPoint gp : gridPointsList) {
Vector3D localIJK = gp.localIJK; Vector3D localIJK = gp.localIJK;
Vector3D nativeIJK = Vector3D nativeIJK = nativeToLocal.getKey().applyInverseTo(localIJK).add(nativeToLocal.getValue());
nativeToLocal.getKey().applyInverseTo(localIJK).add(nativeToLocal.getValue());
Vector3D globalIJK = plane.localToGlobal(nativeIJK); Vector3D globalIJK = plane.localToGlobal(nativeIJK);
roiMap.put(globalIJK, isInsideROI(gp.localIJK)); roiMap.put(globalIJK, isInsideROI(gp.localIJK));
profileMap.put(globalIJK, 0); profileMap.put(globalIJK, 0);
@@ -639,8 +630,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
try (PrintWriter pw = new PrintWriter(csvFile)) { try (PrintWriter pw = new PrintWriter(csvFile)) {
pw.println(getHeader(header)); pw.println(getHeader(header));
for (GridPoint gp : gridPoints) for (GridPoint gp : gridPoints) pw.println(toCSV(gp));
pw.println(toCSV(gp));
} catch (FileNotFoundException e) { } catch (FileNotFoundException e) {
logger.warn("Can't write " + csvFile); logger.warn("Can't write " + csvFile);
logger.warn(e.getLocalizedMessage()); logger.warn(e.getLocalizedMessage());
@@ -654,12 +644,11 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
* @param header file header * @param header file header
* @param outputBasename CSV file to write * @param outputBasename CSV file to write
*/ */
private Map<Integer, Collection<GridPoint>> writeProfileCSV(Collection<GridPoint> gridPoints, private Map<Integer, Collection<GridPoint>> writeProfileCSV(
String header, String outputBasename) { Collection<GridPoint> gridPoints, String header, String outputBasename) {
      Map<Integer, Collection<GridPoint>> profileMap = new HashMap<>();
-     if (numProfiles == 0)
-       return profileMap;
+     if (numProfiles == 0) return profileMap;
      // sort grid points into radial bins
      NavigableMap<Integer, Set<GridPoint>> radialMap = new TreeMap<>();
@@ -677,8 +666,8 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
      profileMap.put(i, profileGridPoints);
      double angle = deltaAngle * i;
-     String csvFile = String.format("%s_profile_%03d.csv", outputBasename,
-         (int) Math.round(Math.toDegrees(angle)));
+     String csvFile =
+         String.format("%s_profile_%03d.csv", outputBasename, (int) Math.round(Math.toDegrees(angle)));
      try (PrintWriter pw = new PrintWriter(csvFile)) {
        pw.println(getHeader(header));
@@ -693,11 +682,9 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
      // sort points in this radial bin by angular distance from profile angle
      NavigableSet<GridPoint> sortedByAngle = new TreeSet<>((o1, o2) -> {
        double angle1 = Math.atan2(o1.localIJK.getY(), o1.localIJK.getX());
-       if (angle1 < 0)
-         angle1 += 2 * Math.PI;
+       if (angle1 < 0) angle1 += 2 * Math.PI;
        double angle2 = Math.atan2(o2.localIJK.getY(), o2.localIJK.getX());
-       if (angle2 < 0)
-         angle2 += 2 * Math.PI;
+       if (angle2 < 0) angle2 += 2 * Math.PI;
        return Double.compare(Math.abs(angle1 - angle), Math.abs(angle2 - angle));
      });
@@ -705,14 +692,12 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
        pw.println(toCSV(sortedByAngle.first()));
        GridPoint thisPoint = sortedByAngle.first();
-       if (Double.isFinite(thisPoint.differentialHeight))
-         profileGridPoints.add(thisPoint);
+       if (Double.isFinite(thisPoint.differentialHeight)) profileGridPoints.add(thisPoint);
      }
    } catch (FileNotFoundException e) {
      logger.warn("Can't write {}", csvFile);
      logger.warn(e.getLocalizedMessage());
    }
  }
  return profileMap;
@@ -725,14 +710,13 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
   * @param header file header
   * @param outputBasename CSV file to write
   */
-  private Map<Integer, Collection<GridPoint>> writeSectorCSV(Collection<GridPoint> gridPoints,
-      String header, String outputBasename) {
+  private Map<Integer, Collection<GridPoint>> writeSectorCSV(
+      Collection<GridPoint> gridPoints, String header, String outputBasename) {
    // grid points in each sector
    Map<Integer, Collection<GridPoint>> sectorMap = new HashMap<>();
-   if (numProfiles == 0)
-     return sectorMap;
+   if (numProfiles == 0) return sectorMap;
    String csvFile = outputBasename + "_sector.csv";
@@ -749,8 +733,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
    for (GridPoint gp : gridPoints) {
      Vector3D localIJK = gp.localIJK;
      double azimuth = Math.atan2(localIJK.getY(), localIJK.getX());
-     if (azimuth < 0)
-       azimuth += 2 * Math.PI;
+     if (azimuth < 0) azimuth += 2 * Math.PI;
      double key = aboveMap.floorKey(azimuth);
      int sector = (int) (key / deltaAngle);
@@ -788,7 +771,6 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
    }
    return sectorMap;
  }
  private class GridPoint implements Comparable<GridPoint> {
@@ -805,8 +787,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
     */
    public GridPoint(Vector3D xy) {
      this.localIJK = xy;
-     Vector3D nativeIJK =
-         nativeToLocal.getKey().applyInverseTo(localIJK).add(nativeToLocal.getValue());
+     Vector3D nativeIJK = nativeToLocal.getKey().applyInverseTo(localIJK).add(nativeToLocal.getValue());
      globalIJK = plane.localToGlobal(nativeIJK);
      referenceHeight = getRefHeight(nativeIJK.getX(), nativeIJK.getY());
      height = getHeight(nativeIJK.getX(), nativeIJK.getY());
@@ -819,8 +800,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
    @Override
    public int compareTo(GridPoint o) {
      int compare = Double.compare(localIJK.getX(), o.localIJK.getX());
-     if (compare == 0)
-       compare = Double.compare(localIJK.getY(), o.localIJK.getY());
+     if (compare == 0) compare = Double.compare(localIJK.getY(), o.localIJK.getY());
      return compare;
    }
  }
@@ -840,8 +820,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
    } else {
      List<String> lines = FileUtils.readLines(new File(filename), Charset.defaultCharset());
      for (String line : lines) {
-       if (line.trim().isEmpty() || line.trim().startsWith("#"))
-         continue;
+       if (line.trim().isEmpty() || line.trim().startsWith("#")) continue;
        SBMTStructure structure = SBMTStructure.fromString(line);
        points.add(structure.centerXYZ());
      }
@@ -851,32 +830,46 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
    }
    return points;
  }

  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
-   options.addOption(Option.builder("gridExtent").required().hasArg().desc(
+   options.addOption(Option.builder("gridExtent")
+       .required()
+       .hasArg()
+       .desc(
            "Required. Size of local grid, in same units as shape model and reference surface. Grid is assumed to be square.")
        .build());
-   options.addOption(Option.builder("gridSpacing").required().hasArg()
-       .desc(
-           "Required. Spacing of local grid, in same units as shape model and reference surface.")
+   options.addOption(Option.builder("gridSpacing")
+       .required()
+       .hasArg()
+       .desc("Required. Spacing of local grid, in same units as shape model and reference surface.")
        .build());
-   options.addOption(Option.builder("logFile").hasArg()
-       .desc("If present, save screen output to log file.").build());
-   options.addOption(Option.builder("logLevel").hasArg()
+   options.addOption(Option.builder("logFile")
+       .hasArg()
+       .desc("If present, save screen output to log file.")
+       .build());
+   options.addOption(Option.builder("logLevel")
+       .hasArg()
        .desc("If present, print messages above selected priority. Valid values are "
            + "ALL, OFF, SEVERE, WARNING, INFO, CONFIG, FINE, FINER, or FINEST. Default is INFO.")
        .build());
-   options.addOption(Option.builder("numProfiles").hasArg().desc(
-       "Number of radial profiles to create. Profiles are evenly spaced in degrees and evaluated "
+   options.addOption(Option.builder("numProfiles")
+       .hasArg()
+       .desc("Number of radial profiles to create. Profiles are evenly spaced in degrees and evaluated "
            + "at intervals of gridSpacing in the radial direction.")
        .build());
-   options.addOption(Option.builder("origin").hasArg()
-       .desc("If present, set origin of local coordinate system. "
+   options.addOption(Option.builder("origin")
+       .hasArg()
+       .desc(
+           "If present, set origin of local coordinate system. "
            + "Options are MAX_HEIGHT (set to maximum elevation of the shape model), "
            + "MIN_HEIGHT (set to minimum elevation of the shape model), "
            + "or a three element vector specifying the desired origin, comma separated, no spaces (e.g. 11.45,-45.34,0.932).")
        .build());
-   options.addOption(Option.builder("output").hasArg().required().desc(
+   options.addOption(Option.builder("output")
+       .hasArg()
+       .required()
+       .desc(
            "Basename of output files. Files will be named ${output}_grid.csv for the grid, ${output}_sector.csv for the sectors, "
            + "and ${output}_profile_${degrees}.csv for profiles.")
        .build());
@@ -884,26 +877,38 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
        .desc("Specify +Z direction of local coordinate system to be in the radial "
            + "direction. Default is to align local +Z along global +Z.")
        .build());
-   options.addOption(Option.builder("referenceList").hasArg().desc(
-       "File containing reference points. If the file extension is .vtk it is read as a VTK file, "
+   options.addOption(Option.builder("referenceList")
+       .hasArg()
+       .desc("File containing reference points. If the file extension is .vtk it is read as a VTK file, "
            + "otherwise it is assumed to be an SBMT structure file.")
        .build());
-   options.addOption(Option.builder("referenceShape").hasArg().desc("Reference shape.").build());
-   options.addOption(Option.builder("referenceVTK").hasArg()
+   options.addOption(Option.builder("referenceShape")
+       .hasArg()
+       .desc("Reference shape.")
+       .build());
+   options.addOption(Option.builder("referenceVTK")
+       .hasArg()
        .desc("If present, write out a VTK file with the reference surface at each grid point. "
            + "If an ROI is defined color points inside/outside the boundaries.")
        .build());
-   options.addOption(Option.builder("roiInner").hasArg().desc(
+   options.addOption(Option.builder("roiInner")
+       .hasArg()
+       .desc(
            "Flag points closer to the origin than this as outside the ROI. Supported formats are the same as referenceList.")
        .build());
-   options.addOption(Option.builder("roiOuter").hasArg().desc(
+   options.addOption(Option.builder("roiOuter")
+       .hasArg()
+       .desc(
            "Flag points closer to the origin than this as outside the ROI. Supported formats are the same as referenceList.")
        .build());
-   options.addOption(Option.builder("shapeModel").hasArg().required()
-       .desc("Shape model for volume computation.").build());
+   options.addOption(Option.builder("shapeModel")
+       .hasArg()
+       .required()
+       .desc("Shape model for volume computation.")
+       .build());
    return options;
  }

  public static void main(String[] args) {
    TerrasaurTool defaultOBJ = new DifferentialVolumeEstimator();
@@ -917,7 +922,11 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
    StringBuilder header = new StringBuilder();
    header.append("# ").append(new Date()).append("\n");
-   header.append("# ").append(defaultOBJ.getClass().getSimpleName()).append(" [").append(AppVersion.getVersionString()).append("]\n");
+   header.append("# ")
+       .append(defaultOBJ.getClass().getSimpleName())
+       .append(" [")
+       .append(AppVersion.getVersionString())
+       .append("]\n");
    header.append("# ").append(startupMessages.get(MessageLabel.ARGUMENTS)).append("\n");
    NativeLibraryLoader.loadVtkLibraries();
@@ -929,8 +938,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
    String dirName = FilenameUtils.getFullPath(outputBasename);
    if (!dirName.trim().isEmpty()) {
      File dir = new File(dirName);
-     if (!dir.exists())
-       dir.mkdirs();
+     if (!dir.exists()) dir.mkdirs();
    }
    vtkPolyData polyData = null;
@@ -948,8 +956,10 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
    if (originString.contains(",")) {
      String[] parts = originString.split(",");
      if (parts.length == 3) {
-       localOrigin = new Vector3D(Double.parseDouble(parts[0].trim()),
-           Double.parseDouble(parts[1].trim()), Double.parseDouble(parts[2].trim()));
+       localOrigin = new Vector3D(
+           Double.parseDouble(parts[0].trim()),
+           Double.parseDouble(parts[1].trim()),
+           Double.parseDouble(parts[2].trim()));
        originType = ORIGIN.CUSTOM;
      }
    } else {
@@ -958,11 +968,9 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
    }
    DifferentialVolumeEstimator app = new DifferentialVolumeEstimator(polyData);
-   if (cl.hasOption("numProfiles"))
-     app.setNumProfiles(Integer.parseInt(cl.getOptionValue("numProfiles")));
-   if (cl.hasOption("radialUp"))
-     app.setRadialUp(true);
+   if (cl.hasOption("numProfiles")) app.setNumProfiles(Integer.parseInt(cl.getOptionValue("numProfiles")));
+   if (cl.hasOption("radialUp")) app.setRadialUp(true);
    if (cl.hasOption("referenceShape")) {
      try {
@@ -994,11 +1002,9 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
        break;
    }
-   if (cl.hasOption("roiInner"))
-     app.setInnerROI(cl.getOptionValue("roiInner"));
-   if (cl.hasOption("roiOuter"))
-     app.setOuterROI(cl.getOptionValue("roiOuter"));
+   if (cl.hasOption("roiInner")) app.setInnerROI(cl.getOptionValue("roiInner"));
+   if (cl.hasOption("roiOuter")) app.setOuterROI(cl.getOptionValue("roiOuter"));
    NavigableSet<GridPoint> gridPoints = app.createGrid(gridHalfExtent, gridSpacing);

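For context, a hypothetical invocation of DifferentialVolumeEstimator built from the options defined above. The file names and grid values are placeholders, and the tool is assumed to be launched by its class name like the other Terrasaur tools:

    DifferentialVolumeEstimator -shapeModel crater.obj -gridExtent 0.5 -gridSpacing 0.01 \
        -numProfiles 8 -origin MAX_HEIGHT -output crater_dve

With these arguments the tool would write crater_dve_grid.csv, crater_dve_sector.csv, and one crater_dve_profile_${degrees}.csv per profile, as described by the -output option.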
View File

@@ -79,14 +79,12 @@ public class DumpConfig implements TerrasaurTool {
      PropertiesConfiguration config = new ConfigBlockFactory().toConfig(configBlock);
      PropertiesConfigurationLayout layout = config.getLayout();
-     String now =
-         DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
+     String now = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
              .withLocale(Locale.getDefault())
              .withZone(ZoneOffset.UTC)
              .format(Instant.now());
      layout.setHeaderComment(
-         String.format(
-             "Configuration file for %s\nCreated %s UTC", AppVersion.getVersionString(), now));
+         String.format("Configuration file for %s\nCreated %s UTC", AppVersion.getVersionString(), now));
      config.write(pw);
    } catch (ConfigurationException | IOException e) {

View File

@@ -0,0 +1,261 @@
/*
* The MIT License
* Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in
* all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
* THE SOFTWARE.
*/
package terrasaur.apps;
import java.io.PrintWriter;
import java.util.Arrays;
import java.util.Map;
import java.util.NavigableSet;
import java.util.TreeSet;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
import org.apache.commons.math3.geometry.euclidean.threed.SphericalCoordinates;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.immutables.value.Value;
import terrasaur.templates.TerrasaurTool;
import terrasaur.utils.CellInfo;
import terrasaur.utils.NativeLibraryLoader;
import terrasaur.utils.PolyDataStatistics;
import terrasaur.utils.PolyDataUtil;
import vtk.vtkIdList;
import vtk.vtkOBBTree;
import vtk.vtkPolyData;
public class FacetInfo implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
/**
* This doesn't need to be private, or even declared, but you might want to if you have other
* constructors.
*/
private FacetInfo() {}
@Override
public String shortDescription() {
return "Print info about a facet.";
}
@Override
public String fullDescription(Options options) {
String header = "Prints information about facet(s).";
String footer =
"""
This tool prints out facet center, normal, angle between center and
normal, and other information about the specified facet(s).""";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
private vtkPolyData polyData;
private vtkOBBTree searchTree;
private Vector3D origin;
private FacetInfo(vtkPolyData polyData) {
this.polyData = polyData;
PolyDataStatistics stats = new PolyDataStatistics(polyData);
origin = new Vector3D(stats.getCentroid());
logger.info("Origin is at {}", origin);
logger.info("Creating search tree");
searchTree = new vtkOBBTree();
searchTree.SetDataSet(polyData);
searchTree.SetTolerance(1e-12);
searchTree.BuildLocator();
}
/**
* @param cellId id of this cell
* @return Set of neighboring cells (ones which share a vertex with this one)
*/
private NavigableSet<Long> neighbors(long cellId) {
NavigableSet<Long> neighborCellIds = new TreeSet<>();
vtkIdList vertexIdlist = new vtkIdList();
CellInfo.getCellInfo(polyData, cellId, vertexIdlist);
vtkIdList facetIdlist = new vtkIdList();
for (long i = 0; i < vertexIdlist.GetNumberOfIds(); i++) {
long vertexId = vertexIdlist.GetId(i);
polyData.GetPointCells(vertexId, facetIdlist);
}
for (long i = 0; i < facetIdlist.GetNumberOfIds(); i++) {
long id = facetIdlist.GetId(i);
if (id == cellId) continue;
neighborCellIds.add(id);
}
return neighborCellIds;
}
@Value.Immutable
public abstract static class FacetInfoLine {
public abstract long index();
public abstract Vector3D radius();
public abstract Vector3D normal();
/**
* @return facets between this and origin
*/
public abstract NavigableSet<Long> interiorIntersections();
/**
* @return facets between this and infinity
*/
public abstract NavigableSet<Long> exteriorIntersections();
public static String getHeader() {
return "# Index, "
+ "Center Lat (deg), "
+ "Center Lon (deg), "
+ "Radius, "
+ "Radial Vector, "
+ "Normal Vector, "
+ "Angle between radial and normal (deg), "
+ "facets between this and origin, "
+ "facets between this and infinity";
}
public String toCSV() {
SphericalCoordinates spc = new SphericalCoordinates(radius());
StringBuilder sb = new StringBuilder();
sb.append(String.format("%d, ", index()));
sb.append(String.format("%.4f, ", 90 - Math.toDegrees(spc.getPhi())));
sb.append(String.format("%.4f, ", Math.toDegrees(spc.getTheta())));
sb.append(String.format("%.6f, ", spc.getR()));
sb.append(String.format("%.6f %.6f %.6f, ", radius().getX(), radius().getY(), radius().getZ()));
sb.append(String.format("%.6f %.6f %.6f, ", normal().getX(), normal().getY(), normal().getZ()));
sb.append(String.format("%.3f, ", Math.toDegrees(Vector3D.angle(radius(), normal()))));
sb.append(String.format("%d", interiorIntersections().size()));
if (!interiorIntersections().isEmpty()) {
for (long id : interiorIntersections()) sb.append(String.format(" %d", id));
}
sb.append(", ");
sb.append(String.format("%d", exteriorIntersections().size()));
if (!exteriorIntersections().isEmpty()) {
for (long id : exteriorIntersections()) sb.append(String.format(" %d", id));
}
return sb.toString();
}
}
private FacetInfoLine getFacetInfoLine(long cellId) {
CellInfo ci = CellInfo.getCellInfo(polyData, cellId, new vtkIdList());
vtkIdList cellIds = new vtkIdList();
searchTree.IntersectWithLine(origin.toArray(), ci.center().toArray(), null, cellIds);
// count up all crossings of the surface between the origin and the facet.
NavigableSet<Long> insideIds = new TreeSet<>();
for (long j = 0; j < cellIds.GetNumberOfIds(); j++) {
if (cellIds.GetId(j) == cellId) continue;
insideIds.add(cellIds.GetId(j));
}
Vector3D infinity = ci.center().scalarMultiply(1e9);
cellIds = new vtkIdList();
searchTree.IntersectWithLine(infinity.toArray(), ci.center().toArray(), null, cellIds);
// count up all crossings of the surface between the infinity and the facet.
NavigableSet<Long> outsideIds = new TreeSet<>();
for (long j = 0; j < cellIds.GetNumberOfIds(); j++) {
if (cellIds.GetId(j) == cellId) continue;
outsideIds.add(cellIds.GetId(j));
}
return ImmutableFacetInfoLine.builder()
.index(cellId)
.radius(ci.center())
.normal(ci.normal())
.interiorIntersections(insideIds)
.exteriorIntersections(outsideIds)
.build();
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("facet")
.required()
.hasArgs()
.desc("Facet(s) to query. Separate multiple indices with whitespace.")
.build());
options.addOption(Option.builder("obj")
.required()
.hasArg()
.desc("Shape model to validate.")
.build());
options.addOption(Option.builder("output")
.required()
.hasArg()
.desc("CSV file to write.")
.build());
return options;
}
public static void main(String[] args) {
TerrasaurTool defaultOBJ = new FacetInfo();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet()) logger.info("{} {}", ml.label, startupMessages.get(ml));
NativeLibraryLoader.loadVtkLibraries();
try {
vtkPolyData polydata = PolyDataUtil.loadShapeModel(cl.getOptionValue("obj"));
FacetInfo app = new FacetInfo(polydata);
try (PrintWriter pw = new PrintWriter(cl.getOptionValue("output"))) {
pw.println(FacetInfoLine.getHeader());
for (long cellId : Arrays.stream(cl.getOptionValues("facet"))
.mapToLong(Long::parseLong)
.toArray()) {
pw.println(app.getFacetInfoLine(cellId).toCSV());
NavigableSet<Long> neighbors = app.neighbors(cellId);
for (long neighborCellId : neighbors)
pw.println(app.getFacetInfoLine(neighborCellId).toCSV());
}
}
logger.info("Wrote {}", cl.getOptionValue("output"));
} catch (Exception e) {
logger.error(e.getLocalizedMessage(), e);
}
logger.info("Finished.");
}
}

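A minimal sketch of running the new FacetInfo tool, assuming it is launched by its class name like the other Terrasaur tools; the shape model name and facet indices are placeholders:

    FacetInfo -obj shape.obj -facet 100 2345 -output facet_info.csv

Each requested facet, followed by its vertex-sharing neighbors, is written as one CSV row using the header defined in FacetInfoLine.getHeader().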
View File

@@ -282,31 +282,19 @@ public class GetSpots implements TerrasaurTool {
    Instrument instrument = new Instrument(instID);
    FOV fov = new FOV(instrument);
-   Matrix33 instrToBodyFixed =
-       fov.getReferenceFrame().getPositionTransformation(BodyFixed, tdbTime);
+   Matrix33 instrToBodyFixed = fov.getReferenceFrame().getPositionTransformation(BodyFixed, tdbTime);
    Vector3 bsightBodyFixed = instrToBodyFixed.mxv(fov.getBoresight());
-   StateRecord sr =
-       new StateRecord(
-           new Body(SC_ID_String),
-           tdbTime,
-           BodyFixed,
-           new AberrationCorrection("LT+S"),
-           new Body(TARGET));
+   StateRecord sr = new StateRecord(
+       new Body(SC_ID_String), tdbTime, BodyFixed, new AberrationCorrection("LT+S"), new Body(TARGET));
    Vector3 scposBodyFixed = sr.getPosition();
-   PositionVector sunPos =
-       new StateRecord(
-           new Body("SUN"),
-           tdbTime,
-           BodyFixed,
-           new AberrationCorrection("LT+S"),
-           new Body(TARGET))
+   PositionVector sunPos = new StateRecord(
+           new Body("SUN"), tdbTime, BodyFixed, new AberrationCorrection("LT+S"), new Body(TARGET))
        .getPosition();
    double[] double3 = new double[3];
-   long cellID =
-       smallBodyModel.computeRayIntersection(
+   long cellID = smallBodyModel.computeRayIntersection(
        scposBodyFixed.toArray(), bsightBodyFixed.hat().toArray(), double3);
    if (cellID == -1) return; // no boresight intersection
@@ -344,15 +332,13 @@ public class GetSpots implements TerrasaurTool {
      }
    } else {
      // TODO: add ELLIPSE
-     System.err.printf(
-         "Instrument %s: Unsupported FOV shape %s\n", instrument.getName(), fov.getShape());
+     System.err.printf("Instrument %s: Unsupported FOV shape %s\n", instrument.getName(), fov.getShape());
      System.exit(0);
    }
    // check all of the boundary vectors for surface intersections
    for (Vector3 vector : boundaryBodyFixed) {
-     cellID =
-         smallBodyModel.computeRayIntersection(
+     cellID = smallBodyModel.computeRayIntersection(
          scposBodyFixed.toArray(), vector.hat().toArray(), double3);
      if (cellID == -1) {
        flag = 1;
@@ -377,8 +363,7 @@ public class GetSpots implements TerrasaurTool {
      double emission = facetToSC.sep(facetNormal);
      if (emission > Math.PI / 2) continue;
-     double dist =
-         findDist(scposBodyFixed, bsightIntersectVector, facetCenter) * 1e3; // milliradians
+     double dist = findDist(scposBodyFixed, bsightIntersectVector, facetCenter) * 1e3; // milliradians
      if (dist < maxdist) {
        Vector3 facetToSun = sunPos.sub(facetCenter);
@@ -398,8 +383,7 @@ public class GetSpots implements TerrasaurTool {
          facetPlane.project(bsightIntersectVector).sub(facetCenter);
      // 0 = North, 90 = East
-     double pos =
-         Math.toDegrees(Math.acos(localNorth.hat().dot(bsightIntersectProjection.hat())));
+     double pos = Math.toDegrees(Math.acos(localNorth.hat().dot(bsightIntersectProjection.hat())));
      if (localNorth.cross(bsightIntersectProjection).dot(facetNormal) > 0) pos = 360 - pos;
      int nCovered = 0;
@@ -439,9 +423,7 @@ public class GetSpots implements TerrasaurTool {
        }
      }
-     StringBuilder output =
-         new StringBuilder(
-             String.format(
+     StringBuilder output = new StringBuilder(String.format(
          "%s %.4f %5.1f %.1f %d %.1f %.1f %.1f",
          sclkTime,
          dist,
@@ -575,17 +557,19 @@ public class GetSpots implements TerrasaurTool {
  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
-   options.addOption(Option.builder("spice").required().hasArg().desc("SPICE metakernel").build());
-   options.addOption(Option.builder("obj").required().hasArg().desc("Shape file").build());
-   options.addOption(
-       Option.builder("instype")
-           .required()
-           .hasArg()
-           .desc(
-               "one of OLA_LOW, OLA_HIGH, OTES, OVIRS_SCI, REXIS, REXIS_SXM, POLYCAM, MAPCAM, SAMCAM, or NAVCAM")
-           .build());
-   options.addOption(
-       Option.builder("sclk")
+   options.addOption(Option.builder("spice")
+       .required()
+       .hasArg()
+       .desc("SPICE metakernel")
+       .build());
+   options.addOption(
+       Option.builder("obj").required().hasArg().desc("Shape file").build());
+   options.addOption(Option.builder("instype")
+       .required()
+       .hasArg()
+       .desc("one of OLA_LOW, OLA_HIGH, OTES, OVIRS_SCI, REXIS, REXIS_SXM, POLYCAM, MAPCAM, SAMCAM, or NAVCAM")
+       .build());
+   options.addOption(Option.builder("sclk")
        .required()
        .hasArg()
        .desc(
@@ -596,19 +580,15 @@ file containing sclk values for instrument observation times. All values betwee
            3/605862045.24157
            END""")
        .build());
-   options.addOption(
-       Option.builder("maxdist")
+   options.addOption(Option.builder("maxdist")
        .required()
        .hasArg()
        .desc("maximum distance of boresight from facet center in milliradians")
        .build());
-   options.addOption(
-       Option.builder("all-facets")
-           .desc(
-               "Optional. If present, entries for all facets will be output, even if there is no intersection.")
+   options.addOption(Option.builder("all-facets")
+       .desc("Optional. If present, entries for all facets will be output, even if there is no intersection.")
        .build());
-   options.addOption(
-       Option.builder("verbose")
+   options.addOption(Option.builder("verbose")
        .hasArg()
        .desc(
            "Optional. A level of 1 is equivalent to -all-facets. A level of 2 or higher will print out the boresight intersection position at each sclk.")
@@ -636,8 +616,7 @@ file containing sclk values for instrument observation times. All values betwee
    int debugLevel = Integer.parseInt(cl.getOptionValue("verbose", "0"));
    if (cl.hasOption("all-facets")) debugLevel = debugLevel == 0 ? 1 : debugLevel + 1;
-   GetSpots gs =
-       new GetSpots(spicemetakernel, objfile, instrumenttype, sclkfile, distance, debugLevel);
+   GetSpots gs = new GetSpots(spicemetakernel, objfile, instrumenttype, sclkfile, distance, debugLevel);
    try {
      gs.process();
      gs.printMap();

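A hypothetical GetSpots invocation based on the required options above; the kernel, shape model, and sclk file names are placeholders:

    GetSpots -spice mission.tm -obj target.obj -instype OTES -sclk sclk_times.txt -maxdist 5.0

Adding -all-facets (or -verbose 1) reports every facet, and -verbose 2 also prints the boresight intersection at each sclk.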
View File

@@ -120,13 +120,14 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
      targetAccelerationJ2000 = new Vector3();
    } else {
      double duration = t1.getTDBSeconds() - t0.getTDBSeconds();
-     observerAccelerationJ2000 =
-         finalObserverJ2000
+     observerAccelerationJ2000 = finalObserverJ2000
              .getVelocity()
              .sub(initialObserverJ2000.getVelocity())
              .scale(1. / duration);
-     targetAccelerationJ2000 =
-         finalTargetJ2000.getVelocity().sub(initialTargetJ2000.getVelocity()).scale(1. / duration);
+     targetAccelerationJ2000 = finalTargetJ2000
+         .getVelocity()
+         .sub(initialTargetJ2000.getVelocity())
+         .scale(1. / duration);
    }
  }
@@ -141,23 +142,19 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
    try {
      double delta = et.sub(t0).getMeasure();
-     Vector3 observerPosJ2000 =
-         initialObserverJ2000
+     Vector3 observerPosJ2000 = initialObserverJ2000
              .getPosition()
              .add(initialObserverJ2000.getVelocity().scale(delta))
              .add(observerAccelerationJ2000.scale(0.5 * delta * delta));
-     Vector3 targetPosJ2000 =
-         initialTargetJ2000
+     Vector3 targetPosJ2000 = initialTargetJ2000
              .getPosition()
              .add(initialTargetJ2000.getVelocity().scale(delta))
              .add(targetAccelerationJ2000.scale(0.5 * delta * delta));
      Vector3 scPosJ2000 = observerPosJ2000.sub(targetPosJ2000);
-     Vector3 observerVelJ2000 =
-         initialObserverJ2000.getVelocity().add(observerAccelerationJ2000.scale(delta));
-     Vector3 targetVelJ2000 =
-         initialTargetJ2000.getVelocity().add(targetAccelerationJ2000.scale(delta));
+     Vector3 observerVelJ2000 = initialObserverJ2000.getVelocity().add(observerAccelerationJ2000.scale(delta));
+     Vector3 targetVelJ2000 = initialTargetJ2000.getVelocity().add(targetAccelerationJ2000.scale(delta));
      Vector3 scVelJ2000 = observerVelJ2000.sub(targetVelJ2000);
      StateVector scStateJ2000 = new StateVector(scPosJ2000, scVelJ2000);
@@ -174,7 +171,8 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
        rayBundlePolyData.SetLines(rayBundleCells);
      }
      long id0 = rayBundlePoints.InsertNextPoint(lastState.getPosition().toArray());
-     long id1 = rayBundlePoints.InsertNextPoint(scStateBodyFixed.getPosition().toArray());
+     long id1 = rayBundlePoints.InsertNextPoint(
+         scStateBodyFixed.getPosition().toArray());
      lastState = scStateBodyFixed;
      vtkLine line = new vtkLine();
@@ -216,19 +214,19 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
      StateVector scStateBodyFixed = getStateBodyFixed(et);
-     Vector3 closestPoint =
-         new Vector3(sbm.findClosestPoint(scStateBodyFixed.getPosition().toArray()));
+     Vector3 closestPoint = new Vector3(
+         sbm.findClosestPoint(scStateBodyFixed.getPosition().toArray()));
      Vector3 toSurface = scStateBodyFixed.getPosition().sub(closestPoint);
      double altitude = toSurface.norm();
      // time it takes to get halfway to the surface
-     TDBDuration delta = new TDBDuration(altitude / (2 * scStateBodyFixed.getVelocity().norm()));
+     TDBDuration delta = new TDBDuration(
+         altitude / (2 * scStateBodyFixed.getVelocity().norm()));
      boolean keepGoing = true;
      while (keepGoing) {
        LatitudinalCoordinates lc = new LatitudinalCoordinates(scStateBodyFixed.getPosition());
-       records.add(
-           new ImpactRecord(
+       records.add(new ImpactRecord(
            et,
            scStateBodyFixed,
            new LatitudinalCoordinates(altitude, lc.getLongitude(), lc.getLatitude())));
@@ -237,33 +235,32 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
        scStateBodyFixed = getStateBodyFixed(et);
-       closestPoint = new Vector3(sbm.findClosestPoint(scStateBodyFixed.getPosition().toArray()));
+       closestPoint = new Vector3(
+           sbm.findClosestPoint(scStateBodyFixed.getPosition().toArray()));
        toSurface = scStateBodyFixed.getPosition().sub(closestPoint);
        altitude = toSurface.norm();
        // check that we're still moving towards the target
        if (scStateBodyFixed.getPosition().dot(scStateBodyFixed.getVelocity()) > 0) {
          logger.warn(
-             "Stopping at {}; passed closest approach to the body center.",
-             et.toUTCString("ISOC", 3));
+             "Stopping at {}; passed closest approach to the body center.", et.toUTCString("ISOC", 3));
          keepGoing = false;
        }
        if (altitude > finalHeight) {
-         delta = new TDBDuration(toSurface.norm() / (2 * scStateBodyFixed.getVelocity().norm()));
+         delta = new TDBDuration(toSurface.norm()
+             / (2 * scStateBodyFixed.getVelocity().norm()));
        } else if (altitude > finalStep) {
-         delta = new TDBDuration(finalStep / scStateBodyFixed.getVelocity().norm());
+         delta = new TDBDuration(
+             finalStep / scStateBodyFixed.getVelocity().norm());
        } else {
          keepGoing = false;
        }
      }
      LatitudinalCoordinates lc = new LatitudinalCoordinates(scStateBodyFixed.getPosition());
-     records.add(
-         new ImpactRecord(
-             et,
-             scStateBodyFixed,
-             new LatitudinalCoordinates(altitude, lc.getLongitude(), lc.getLatitude())));
+     records.add(new ImpactRecord(
+         et, scStateBodyFixed, new LatitudinalCoordinates(altitude, lc.getLongitude(), lc.getLatitude())));
    } catch (SpiceException e) {
      logger.error(e.getLocalizedMessage(), e);
@@ -293,8 +290,8 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
    }
  }
- private static Vector3 correctForAberration(
-     Vector3 targetLTS, Body observer, Body target, TDBTime t) throws SpiceException {
+ private static Vector3 correctForAberration(Vector3 targetLTS, Body observer, Body target, TDBTime t)
+     throws SpiceException {
    RemoveAberration ra = new RemoveAberration(target, observer);
    return ra.getGeometricPosition(t, targetLTS);
@@ -302,122 +299,94 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
-   options.addOption(
-       Option.builder("date")
+   options.addOption(Option.builder("date")
        .hasArgs()
        .desc("Initial UTC date. Required if -sumFile is not used.")
        .build());
-   options.addOption(
-       Option.builder("finalHeight")
+   options.addOption(Option.builder("finalHeight")
        .hasArg()
        .desc("Height above surface in meters to consider \"impact\". Default is 1 meter.")
        .build());
-   options.addOption(
-       Option.builder("finalStep")
+   options.addOption(Option.builder("finalStep")
        .hasArg()
-       .desc(
-           "Continue printing output below finalHeight in increments of approximate finalStep "
+       .desc("Continue printing output below finalHeight in increments of approximate finalStep "
            + "(in meters) until zero. Default is to stop at finalHeight.")
        .build());
-   options.addOption(
-       Option.builder("frame")
+   options.addOption(Option.builder("frame")
        .required()
        .hasArg()
        .desc("Required. Name of body fixed frame.")
        .build());
-   options.addOption(
-       Option.builder("instrumentFrame")
+   options.addOption(Option.builder("instrumentFrame")
        .hasArg()
-       .desc(
-           "SPICE ID for the camera reference frame. Required if -outputTransform "
+       .desc("SPICE ID for the camera reference frame. Required if -outputTransform "
            + "AND -sumFile are used.")
        .build());
-   options.addOption(
-       Option.builder("logFile")
+   options.addOption(Option.builder("logFile")
        .hasArg()
        .desc("If present, save screen output to log file.")
        .build());
    StringBuilder sb = new StringBuilder();
    for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
-   options.addOption(
-       Option.builder("logLevel")
+   options.addOption(Option.builder("logLevel")
        .hasArg()
-       .desc(
-           "If present, print messages above selected priority. Valid values are "
+       .desc("If present, print messages above selected priority. Valid values are "
            + sb.toString().trim()
            + ". Default is INFO.")
        .build());
-   options.addOption(
-       Option.builder("objFile")
+   options.addOption(Option.builder("objFile")
        .required()
        .hasArg()
        .desc("Required. Name of OBJ shape file.")
        .build());
-   options.addOption(
-       Option.builder("observer")
+   options.addOption(Option.builder("observer")
        .required()
        .hasArg()
        .desc("Required. SPICE ID for the impactor.")
        .build());
-   options.addOption(
-       Option.builder("observerFrame")
+   options.addOption(Option.builder("observerFrame")
        .hasArg()
-       .desc(
-           "SPICE ID for the impactor's reference frame. Required if -outputTransform is used.")
+       .desc("SPICE ID for the impactor's reference frame. Required if -outputTransform is used.")
        .build());
-   options.addOption(
-       Option.builder("outputTransform")
+   options.addOption(Option.builder("outputTransform")
        .hasArg()
-       .desc(
-           "If present, write out a transform file that can be used by TransformShape to place "
+       .desc("If present, write out a transform file that can be used by TransformShape to place "
            + "coordinates in the spacecraft frame in the body fixed frame. The rotation "
            + " is evaluated at the sumfile time. The translation is evaluated at the impact time. "
            + "Requires -observerFrame option.")
        .build());
-   options.addOption(
-       Option.builder("position")
+   options.addOption(Option.builder("position")
        .hasArg()
-       .desc(
-           "Spacecraft to body vector in body fixed coordinates. Units are km. "
+       .desc("Spacecraft to body vector in body fixed coordinates. Units are km. "
            + "Spacecraft is at the origin to be consistent with sumFile convention.")
        .build());
-   options.addOption(
-       Option.builder("spice")
+   options.addOption(Option.builder("spice")
        .required()
        .hasArgs()
-       .desc(
-           "Required. SPICE metakernel file containing body fixed frame and spacecraft kernels. "
+       .desc("Required. SPICE metakernel file containing body fixed frame and spacecraft kernels. "
            + "Can specify more than one kernel, separated by whitespace.")
        .build());
-   options.addOption(
-       Option.builder("sumFile")
+   options.addOption(Option.builder("sumFile")
        .hasArg()
-       .desc(
-           "Name of sum file to read. Coordinate system is assumed to be in the body "
+       .desc("Name of sum file to read. Coordinate system is assumed to be in the body "
            + "fixed frame with the spacecraft at the origin.")
        .build());
-   options.addOption(
-       Option.builder("target")
+   options.addOption(Option.builder("target")
        .required()
        .hasArg()
        .desc("Required. SPICE ID for the target.")
        .build());
-   options.addOption(
-       Option.builder("trajectory")
+   options.addOption(Option.builder("trajectory")
        .hasArg()
-       .desc(
-           "If present, name of output VTK file containing trajectory in body fixed coordinates.")
+       .desc("If present, name of output VTK file containing trajectory in body fixed coordinates.")
        .build());
-   options.addOption(
-       Option.builder("verbosity")
+   options.addOption(Option.builder("verbosity")
        .hasArg()
        .desc("This option does nothing! Use -logLevel instead.")
        .build());
-   options.addOption(
-       Option.builder("velocity")
+   options.addOption(Option.builder("velocity")
        .hasArg()
-       .desc(
-           "Spacecraft velocity in J2000 relative to the body. Units are km/s. "
+       .desc("Spacecraft velocity in J2000 relative to the body. Units are km/s. "
            + "If not specified, velocity is calculated using SPICE.")
        .build());
    return options;
@@ -447,18 +416,14 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
    Body target = new Body(cl.getOptionValue("target"));
    final double finalHeight =
-       cl.hasOption("finalHeight")
-           ? Double.parseDouble(cl.getOptionValue("finalHeight")) / 1e3
-           : 1e-3;
+       cl.hasOption("finalHeight") ? Double.parseDouble(cl.getOptionValue("finalHeight")) / 1e3 : 1e-3;
    if (finalHeight <= 0) {
      logger.warn("Argument to -finalHeight must be positive!");
      System.exit(0);
    }
    final double finalStep =
-       cl.hasOption("finalStep")
-           ? Double.parseDouble(cl.getOptionValue("finalStep")) / 1e3
-           : Double.MAX_VALUE;
+       cl.hasOption("finalStep") ? Double.parseDouble(cl.getOptionValue("finalStep")) / 1e3 : Double.MAX_VALUE;
    if (finalStep <= 0) {
      logger.warn("Argument to -finalStep must be positive!");
      System.exit(0);
@@ -528,21 +493,19 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
    }
    if (Math.abs(scPosBodyFixed.sub(initialPos).norm()) > 0) {
-     logger.warn(
-         String.format(
+     logger.warn(String.format(
          "Warning! Spacecraft position relative to target from SPICE is %s while input position is %s",
          new Vector3(scPosBodyFixed), initialPos.toString()));
-     logger.warn(String.format("Difference is %e km", initialPos.sub(scPosBodyFixed).norm()));
+     logger.warn(String.format(
+         "Difference is %e km", initialPos.sub(scPosBodyFixed).norm()));
      logger.warn("Continuing with input position");
    }
    Vector3 initialPosJ2000 = bodyFixed.getPositionTransformation(J2000, et).mxv(initialPos);
    // relative to solar system barycenter in J2000
-   StateVector initialTargetJ2000 =
-       new StateRecord(target, et, J2000, abCorrNone, new Body(0)).getStateVector();
-   StateVector initialObserverJ2000 =
-       new StateVector(
+   StateVector initialTargetJ2000 = new StateRecord(target, et, J2000, abCorrNone, new Body(0)).getStateVector();
+   StateVector initialObserverJ2000 = new StateVector(
        initialPosJ2000.add(initialTargetJ2000.getPosition()),
        new StateRecord(observer, et, J2000, abCorrNone, new Body(0)).getVelocity());
@@ -552,20 +515,17 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
      for (int i = 0; i < 3; i++) tmp[i] = Double.parseDouble(parts[i].trim());
      Vector3 scVelJ2000 = new Vector3(tmp);
-     initialObserverJ2000 =
-         new StateVector(
+     initialObserverJ2000 = new StateVector(
          initialObserverJ2000.getPosition(), scVelJ2000.add(initialTargetJ2000.getVelocity()));
      StateRecord obs = new StateRecord(observer, et, J2000, abCorrNone, new Body(0));
-     logger.info(
-         String.format(
+     logger.info(String.format(
          "spacecraft velocity relative to target from SPICE at %s is %s",
          et.toUTCString("ISOC", 3), obs.getVelocity().sub(initialTargetJ2000.getVelocity())));
      logger.info(String.format("Specified velocity is %s", scVelJ2000));
    }
-   ImpactLocator ifsf =
-       new ImpactLocator(
+   ImpactLocator ifsf = new ImpactLocator(
        J2000,
        bodyFixed,
        sbm,
@@ -624,21 +584,21 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
    double lastET = Math.min(records.last().et.getTDBSeconds(), lastTarget);
    lastET = Math.min(lastET, lastObserver);
    TDBTime last = new TDBTime(lastET);
-   StateRecord finalObserverJ2000 =
-       new StateRecord(observer, last, J2000, abCorrNone, new Body(0));
+   StateRecord finalObserverJ2000 = new StateRecord(observer, last, J2000, abCorrNone, new Body(0));
    StateRecord finalTargetJ2000 = new StateRecord(target, last, J2000, abCorrNone, new Body(0));
    System.out.printf(
        "%s: Observer velocity relative to SSB (J2000): %s%n",
        last.toUTCString("ISOC", 3), finalObserverJ2000.getVelocity());
    double duration = last.getTDBSeconds() - first.getTDBSeconds();
-   Vector3 observerAccelerationJ2000 =
-       finalObserverJ2000
+   Vector3 observerAccelerationJ2000 = finalObserverJ2000
            .getVelocity()
            .sub(initialObserverJ2000.getVelocity())
            .scale(1. / duration);
-   Vector3 targetAccelerationJ2000 =
-       finalTargetJ2000.getVelocity().sub(initialTargetJ2000.getVelocity()).scale(1. / duration);
+   Vector3 targetAccelerationJ2000 = finalTargetJ2000
+       .getVelocity()
+       .sub(initialTargetJ2000.getVelocity())
+       .scale(1. / duration);
    System.out.printf("Estimated time of impact %s\n", last.toUTCString("ISOC", 6));
    System.out.printf("Estimated time to impact %.6f seconds\n", duration);
@@ -651,8 +611,7 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
    System.out.println();
    // Run again with constant accelerations for target and observer
-   ifsf =
-       new ImpactLocator(
+   ifsf = new ImpactLocator(
        J2000,
        bodyFixed,
        sbm,
@@ -681,8 +640,7 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
    for (ImpactRecord record : records) {
      PositionVector p = record.scStateBodyFixed.getPosition();
      VelocityVector v = record.scStateBodyFixed.getVelocity();
-     System.out.printf(
-         String.format(
+     System.out.printf(String.format(
          "%26s, %13.6e, %13.6e, %13.6e, %13.6e, %13.6e, %13.6e, %12.4f, %12.4f, %12.4f\n",
          record.et.toUTCString("ISOC", 6),
          p.getElt(0),
@@ -728,7 +686,8 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
    ReferenceFrame instrFrame = null;
    if (cl.hasOption("instrumentFrame"))
-     instrFrame = new ReferenceFrame(cl.getOptionValue("instrumentFrame").toUpperCase());
+     instrFrame = new ReferenceFrame(
+         cl.getOptionValue("instrumentFrame").toUpperCase());
    if (instrFrame == null) {
      logger.error("-instrumentFrame needed for -outputTransform. Exiting.");
      System.exit(0);

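A sketch of an ImpactLocator run using the required options above; the SPICE IDs, frame name, date, and file names are placeholders, not values from this commit:

    ImpactLocator -spice impact.tm -objFile target.obj -frame IAU_TARGET \
        -observer -999 -target 2000001 -date 2029-04-13T21:46:00

-finalHeight and -finalStep are given in meters, and -trajectory writes the body-fixed trajectory to a VTK file.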
View File

@@ -162,16 +162,14 @@ and quality planes are saved out.""";
    if (namingConvention.isEmpty()) {
      File outFname = new File(outfile);
      if (outFname.isDirectory()) {
-       String errMesg =
-           "ERROR! No naming convention specified but output file:"
+       String errMesg = "ERROR! No naming convention specified but output file:"
            + outfile
            + " is a directory! Must be a full path to an output file!";
        throw new RuntimeException(errMesg);
      }
    }
-   DataInputStream is =
-       new DataInputStream(new BufferedInputStream(new FileInputStream(mapletFile)));
+   DataInputStream is = new DataInputStream(new BufferedInputStream(new FileInputStream(mapletFile)));
    DataInput sigmaInput = null;
    BufferedInputStream sigmabis = null;
    if (sigmasFile != null) {
@@ -179,14 +177,11 @@ and quality planes are saved out.""";
      Path filePath = Paths.get(sigmasFile);
      if (!Files.exists(filePath)) {
        System.out.println(
-           "WARNING! sigmas file:"
-               + filePath.toAbsolutePath()
-               + " not found! Sigmas will default to 0!");
+           "WARNING! sigmas file:" + filePath.toAbsolutePath() + " not found! Sigmas will default to 0!");
      } else {
-       sigmabis =
-           new BufferedInputStream(new FileInputStream(filePath.toAbsolutePath().toString()));
-       sigmaInput =
-           swapBytes ? new LittleEndianDataInputStream(sigmabis) : new DataInputStream(sigmabis);
+       sigmabis = new BufferedInputStream(
+           new FileInputStream(filePath.toAbsolutePath().toString()));
+       sigmaInput = swapBytes ? new LittleEndianDataInputStream(sigmabis) : new DataInputStream(sigmabis);
      }
    }
    DataInput qualityInput = null;
@@ -195,15 +190,13 @@ and quality planes are saved out.""";
      System.out.println("Parsing " + qualityFile + " for quality values.");
      Path filePath = Paths.get(qualityFile);
      if (!Files.exists(filePath)) {
-       System.out.println(
-           "WARNING! quality file:"
+       System.out.println("WARNING! quality file:"
            + filePath.toAbsolutePath()
            + " not found! Quality values will default to 0!");
      } else {
-       qualbis =
-           new BufferedInputStream(new FileInputStream(filePath.toAbsolutePath().toString()));
-       qualityInput =
-           swapBytes ? new LittleEndianDataInputStream(qualbis) : new DataInputStream(qualbis);
+       qualbis = new BufferedInputStream(
+           new FileInputStream(filePath.toAbsolutePath().toString()));
+       qualityInput = swapBytes ? new LittleEndianDataInputStream(qualbis) : new DataInputStream(qualbis);
      }
    }
@@ -374,8 +367,7 @@ and quality planes are saved out.""";
    // define this SIGMA as a measurement of height uncertainty
-   hdrBuilder.setVCbyHeaderTag(
-       HeaderTag.SIG_DEF, "height uncertainty", HeaderTag.SIG_DEF.comment());
+   hdrBuilder.setVCbyHeaderTag(HeaderTag.SIG_DEF, "height uncertainty", HeaderTag.SIG_DEF.comment());
    // headerValues.put(HeaderTag.SIG_DEF.toString(),
    // new HeaderCard(HeaderTag.SIG_DEF.toString(), "Uncertainty", HeaderTag.SIG_DEF.comment()));
@@ -407,8 +399,7 @@ and quality planes are saved out.""";
    // create FitsData object. Stores data and relevant information about the data
    FitsDataBuilder fitsDataB = new FitsDataBuilder(data, isGlobal);
-   FitsData fitsData =
-       fitsDataB
+   FitsData fitsData = fitsDataB
        .setV(V)
        .setAltProdType(productType)
        .setDataSource(dataSource)
@@ -438,9 +429,7 @@ and quality planes are saved out.""";
    // create llrData object. Stores lat,lon,radius information. Needed to fill out fits header
    FitsDataBuilder llrDataB = new FitsDataBuilder(llrData, isGlobal);
-   FitsData llrFitsData =
-       llrDataB
-           .setV(V)
+   FitsData llrFitsData = llrDataB.setV(V)
        .setAltProdType(productType)
        .setDataSource(dataSource)
        .setU(ux, UnitDir.UX)
@@ -468,35 +457,35 @@ and quality planes are saved out.""";
      planeList.add(PlaneInfo.SIGMA);
      planeList.add(PlaneInfo.QUALITY);
-     saveDTMFits(
-         hdrBuilder, fitsData, planeList, namingConvention, productType, isGlobal, outfile);
+     saveDTMFits(hdrBuilder, fitsData, planeList, namingConvention, productType, isGlobal, outfile);
    }
  }

  private static Options defineOptions() {
    Options options = TerrasaurTool.defineOptions();
-   options.addOption(
-       Option.builder("input-map").required().hasArg().desc("input maplet file").build());
-   options.addOption(
-       Option.builder("output-fits").required().hasArg().desc("output FITS file").build());
-   options.addOption(
-       Option.builder("exclude-position")
+   options.addOption(Option.builder("input-map")
+       .required()
+       .hasArg()
+       .desc("input maplet file")
+       .build());
+   options.addOption(Option.builder("output-fits")
+       .required()
+       .hasArg()
+       .desc("output FITS file")
+       .build());
+   options.addOption(Option.builder("exclude-position")
        .desc(
            "Only save out the height, albedo, sigma, quality, and hazard planes to the output file. Used for creating NFT MLNs.")
        .build());
-   options.addOption(
-       Option.builder("noHazard")
+   options.addOption(Option.builder("noHazard")
        .desc(
            "Only used in conjunction with -exclude-position. If present then the NFT MLN will NOT include a Hazard plane initially filled with all ones.")
        .build());
-   options.addOption(
-       Option.builder("hazardVal")
+   options.addOption(Option.builder("hazardVal")
        .hasArg()
-       .desc(
-           "Only used in conjunction with -exclude-position. If present then will use the specified value.")
+       .desc("Only used in conjunction with -exclude-position. If present then will use the specified value.")
        .build());
-   options.addOption(
-       Option.builder("lsb")
+   options.addOption(Option.builder("lsb")
        .desc(
            """
            By default the sigmas and quality binary files are assumed to be in big-endian floating
@@ -504,43 +493,35 @@ and quality planes are saved out.""";
            in little-endian format. For example, products created by SPC executables are OS
            dependent and intel Linux OSes use little-endian.""")
        .build());
-   options.addOption(
-       Option.builder("scalFactor")
+   options.addOption(Option.builder("scalFactor")
        .hasArg()
        .desc(
            "Enter scale factor used to convert scale to ground sample distance in mm i.e. for SPC maplets the scale factor is 1000000 (km to mm). Set to 1.0e6 by default.")
        .build());
-   options.addOption(
-       Option.builder("sigmas-file")
+   options.addOption(Option.builder("sigmas-file")
        .hasArg()
        .desc(
            "Path to binary sigmas file containing sigma values per pixel, in same order as the maplet file. If this option is omitted, the sigma plane is set to all zeros.")
        .build());
-   options.addOption(
-       Option.builder("sigsum-file")
+   options.addOption(Option.builder("sigsum-file")
        .hasArg()
-       .desc(
-           "Path to ascii sigma summary file, containing the overall sigma value of the maplet.")
+       .desc("Path to ascii sigma summary file, containing the overall sigma value of the maplet.")
        .build());
-   options.addOption(
-       Option.builder("sigmaScale")
+   options.addOption(Option.builder("sigmaScale")
        .hasArg()
        .desc(
            "Scale sigmas from sigmas-file by <value>. Only applicable if -sigmas-file is used. Defaults to 1 if not specified.")
        .build());
-   options.addOption(
-       Option.builder("mapname")
+   options.addOption(Option.builder("mapname")
        .hasArg()
        .desc("Sets the MAP_NAME fits keyword to <mapname>. Default is 'Non-NFT DTM'")
        .build());
-   options.addOption(
-       Option.builder("quality-file")
+   options.addOption(Option.builder("quality-file")
        .hasArg()
        .desc(
            "Path to binary quality file containing quality values. If this option is omitted, the quality plane is set to all zeros.")
        .build());
options.addOption( options.addOption(Option.builder("configFile")
Option.builder("configFile")
.hasArg() .hasArg()
.desc( .desc(
""" """
@@ -549,8 +530,7 @@ and quality planes are saved out.""";
that are required by the ALTWG SIS. The values may not be populated or UNK if they cannot that are required by the ALTWG SIS. The values may not be populated or UNK if they cannot
be derived from the data itself. This configuration file is a way to populate those values.""") be derived from the data itself. This configuration file is a way to populate those values.""")
.build()); .build());
options.addOption( options.addOption(Option.builder("namingConvention")
Option.builder("namingConvention")
.hasArg() .hasArg()
.desc( .desc(
""" """
@@ -626,8 +606,7 @@ and quality planes are saved out.""";
public static String parseSigsumFile(String sigsumFile) throws IOException { public static String parseSigsumFile(String sigsumFile) throws IOException {
if (!Files.exists(Paths.get(sigsumFile))) { if (!Files.exists(Paths.get(sigsumFile))) {
System.out.println( System.out.println("Warning! Sigmas summary file:"
"Warning! Sigmas summary file:"
+ sigsumFile + sigsumFile
+ " does not exist! Not " + " does not exist! Not "
+ "able to parse sigma summary."); + "able to parse sigma summary.");
@@ -860,8 +839,7 @@ and quality planes are saved out.""";
// possible renaming of output file // possible renaming of output file
NameConvention nameConvention = NameConvention.parseNameConvention(namingConvention); NameConvention nameConvention = NameConvention.parseNameConvention(namingConvention);
File[] outFiles = File[] outFiles =
NamingFactory.getBaseNameAndCrossRef( NamingFactory.getBaseNameAndCrossRef(nameConvention, hdrBuilder, productType, isGlobal, outfile);
nameConvention, hdrBuilder, productType, isGlobal, outfile);
// check if cross-ref file is not null. If so then output file was renamed to naming convention. // check if cross-ref file is not null. If so then output file was renamed to naming convention.
crossrefFile = outFiles[1]; crossrefFile = outFiles[1];
@@ -881,7 +859,6 @@ and quality planes are saved out.""";
} }
outFitsFname = new File(outputFolder, baseOutputName + ".fits").getAbsolutePath(); outFitsFname = new File(outputFolder, baseOutputName + ".fits").getAbsolutePath();
} }
ProductFits.saveDataCubeFits( ProductFits.saveDataCubeFits(fitsData, planeList, outFitsFname, hdrBuilder, hdrType, crossrefFile);
fitsData, planeList, outFitsFname, hdrBuilder, hdrType, crossrefFile);
} }
} }

View File

@@ -22,6 +22,9 @@
*/ */
package terrasaur.apps; package terrasaur.apps;
import com.beust.jcommander.JCommander;
import com.beust.jcommander.Parameter;
import com.beust.jcommander.ParameterException;
import java.io.BufferedWriter; import java.io.BufferedWriter;
import java.io.File; import java.io.File;
import java.io.FileWriter; import java.io.FileWriter;
@@ -33,9 +36,6 @@ import java.util.ArrayList;
import java.util.HashMap; import java.util.HashMap;
import java.util.List; import java.util.List;
import java.util.Map; import java.util.Map;
import com.beust.jcommander.JCommander;
import com.beust.jcommander.Parameter;
import com.beust.jcommander.ParameterException;
import org.apache.commons.cli.Options; import org.apache.commons.cli.Options;
import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger; import org.apache.logging.log4j.Logger;
@@ -53,8 +53,6 @@ public class OBJ2DSK implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger(); private static final Logger logger = LogManager.getLogger();
@Override @Override
public String shortDescription() { public String shortDescription() {
return "Convert an OBJ shape file to SPICE DSK format."; return "Convert an OBJ shape file to SPICE DSK format.";
@@ -71,11 +69,17 @@ public class OBJ2DSK implements TerrasaurTool {
jcUsage.setColumnSize(100); jcUsage.setColumnSize(100);
jcUsage.usage(builder, 4, arguments.commandDescription); jcUsage.usage(builder, 4, arguments.commandDescription);
return builder.toString(); return builder.toString();
} }
private enum DSK_KEYS { private enum DSK_KEYS {
SURF_NAME, CENTER_NAME, REFFRAME_NAME, NAIF_SURFNAME, NAIF_SURFCODE, NAIF_SURFBODY, METAK, COMMENTFILE SURF_NAME,
CENTER_NAME,
REFFRAME_NAME,
NAIF_SURFNAME,
NAIF_SURFCODE,
NAIF_SURFBODY,
METAK,
COMMENTFILE
} }
private static class Arguments { private static class Arguments {
@@ -87,122 +91,166 @@ public class OBJ2DSK implements TerrasaurTool {
@Parameter(names = "-help", help = true) @Parameter(names = "-help", help = true)
private boolean help; private boolean help;
@Parameter(names = "--latMin", description = "<latMin> Minimum latitude of OBJ in degrees.", @Parameter(
required = false, order = 0) names = "--latMin",
description = "<latMin> Minimum latitude of OBJ in degrees.",
required = false,
order = 0)
private double latMin = -90D; private double latMin = -90D;
@Parameter(names = "--latMax", description = "<latMax> Maximum latitude of OBJ in degrees.", @Parameter(
required = false, order = 1) names = "--latMax",
description = "<latMax> Maximum latitude of OBJ in degrees.",
required = false,
order = 1)
private double latMax = 90D; private double latMax = 90D;
@Parameter(names = "--lonMin", description = "<lonMin> Minimum longitude of OBJ in degrees.", @Parameter(
required = false, order = 2) names = "--lonMin",
description = "<lonMin> Minimum longitude of OBJ in degrees.",
required = false,
order = 2)
private double lonMin = 0; private double lonMin = 0;
@Parameter(names = "--lonMax", description = "<lonMax> Maximum longitude of OBJ in degrees.", @Parameter(
required = false, order = 3) names = "--lonMax",
description = "<lonMax> Maximum longitude of OBJ in degrees.",
required = false,
order = 3)
private double lonMax = 360D; private double lonMax = 360D;
@Parameter(names = "--fitsFile", @Parameter(
description = "<filename> path to DTM fits file containing lat,lon" names = "--fitsFile",
description =
"<filename> path to DTM fits file containing lat,lon"
+ " information as planes. Assumes PLANE1=latitude, PLANE2=longitude. Use in place of specifying lat/lon min/max values.", + " information as planes. Assumes PLANE1=latitude, PLANE2=longitude. Use in place of specifying lat/lon min/max values.",
required = false, order = 4) required = false,
order = 4)
private String fitsFile = ""; private String fitsFile = "";
@Parameter(names = "--fine-scale", @Parameter(
names = "--fine-scale",
description = "<fine-voxel-scale> Floating point value representing the " description = "<fine-voxel-scale> Floating point value representing the "
+ " ratio of the spatial index's fine voxel edge length to the average plate extent. " + " ratio of the spatial index's fine voxel edge length to the average plate extent. "
+ " The 'extent' of a plate in a given coordinate direction is the difference between the maximum and minimum " + " The 'extent' of a plate in a given coordinate direction is the difference between the maximum and minimum "
+ " values of that coordinate attained on the plate. Only required if mkdsk version is " + " values of that coordinate attained on the plate. Only required if mkdsk version is "
+ " lower than 66.", + " lower than 66.",
required = false, order = 11) required = false,
order = 11)
double fineVoxScale = Double.NaN; double fineVoxScale = Double.NaN;
@Parameter(names = "--coarse-scale", @Parameter(
names = "--coarse-scale",
description = " <coarse-voxel-scale>" description = " <coarse-voxel-scale>"
+ " Integer value representing the ratio of the edge length of coarse voxels to" + " Integer value representing the ratio of the edge length of coarse voxels to"
+ " fine voxels. The number must be large enough that the total number of coarse" + " fine voxels. The number must be large enough that the total number of coarse"
+ " voxels is less than the value of MAXCGR, which is currently 1E5." + " voxels is less than the value of MAXCGR, which is currently 1E5."
+ " Only required if mkdsk version is lower than 66.", + " Only required if mkdsk version is lower than 66.",
required = false, order = 12) required = false,
order = 12)
Integer coarseVoxScale = -999; Integer coarseVoxScale = -999;
@Parameter(names = "--useSetupFile", @Parameter(
names = "--useSetupFile",
description = "<inputSetupFile> Use <inputSetupFile>" description = "<inputSetupFile> Use <inputSetupFile>"
+ " instead of the default setup file created by the tool.", + " instead of the default setup file created by the tool.",
required = false, order = 13) required = false,
order = 13)
String inputSetup = ""; String inputSetup = "";
@Parameter(names = "--writesetupFile", @Parameter(
names = "--writesetupFile",
description = "<outputSetupFile> Write the setup file" description = "<outputSetupFile> Write the setup file"
+ " to the specified path instead of writing it as a temporary file which gets deleted" + " to the specified path instead of writing it as a temporary file which gets deleted"
+ " after execution.", + " after execution.",
required = false, order = 14) required = false,
order = 14)
String outsetupFname = ""; String outsetupFname = "";
@Parameter(names = "--keepTempFiles", @Parameter(
names = "--keepTempFiles",
description = "enable this to prevent setup files" description = "enable this to prevent setup files"
+ " from being deleted. Used for debugging purposes to see what is being sent" + " from being deleted. Used for debugging purposes to see what is being sent"
+ " to mkdsk.", + " to mkdsk.",
required = false, order = 15) required = false,
order = 15)
boolean keepTmpFiles = false; boolean keepTmpFiles = false;
@Parameter(names = "--mkFile", @Parameter(
names = "--mkFile",
description = "<spice-meta-kernel-file> path to SPICE meta kernel file." description = "<spice-meta-kernel-file> path to SPICE meta kernel file."
+ " Metakernel only needs to point to leap seconds kernel and a frames kernel that contains" + " Metakernel only needs to point to leap seconds kernel and a frames kernel that contains"
+ " the digital ID to CENTER_NAME and REF_FRAME_NAME lookup table." + " the digital ID to CENTER_NAME and REF_FRAME_NAME lookup table."
+ " This argument is REQUIRED if user does NOT supply a setupFile!", + " This argument is REQUIRED if user does NOT supply a setupFile!",
required = false, order = 4) required = false,
order = 4)
String mkFile = ""; String mkFile = "";
@Parameter(names = "--surfName", @Parameter(
names = "--surfName",
description = "<surfaceName> Allows user to modify the " description = "<surfaceName> Allows user to modify the "
+ " SURFACE_NAME (name of the specific shape data set for the central body)" + " SURFACE_NAME (name of the specific shape data set for the central body)"
+ " used in the default setup file created by the tool. This is a required" + " used in the default setup file created by the tool. This is a required"
+ " keyword in the setup file.", + " keyword in the setup file.",
required = false, order = 5) required = false,
order = 5)
String surfaceName = "BENNU"; String surfaceName = "BENNU";
@Parameter(names = "--centName", description = "<centerName> Allows user to modify the " @Parameter(
names = "--centName",
description = "<centerName> Allows user to modify the "
+ " CENTER_NAME (central body name) used in the default setup file created by the tool. " + " CENTER_NAME (central body name) used in the default setup file created by the tool. "
+ " Can also be an ID code. This is a required keyword in the setup file.", + " Can also be an ID code. This is a required keyword in the setup file.",
required = false, order = 6) required = false,
order = 6)
String centerName = "BENNU"; String centerName = "BENNU";
@Parameter(names = "--refName", description = "<refFrameName> Allows user to modify the " @Parameter(
names = "--refName",
description = "<refFrameName> Allows user to modify the "
+ " REF_FRAME_NAME (reference frame name) used in the default setup file created by the tool. " + " REF_FRAME_NAME (reference frame name) used in the default setup file created by the tool. "
+ " This is a required keyword in the setup file.", required = false, order = 7) + " This is a required keyword in the setup file.",
required = false,
order = 7)
String refFrameName = "IAU_BENNU"; String refFrameName = "IAU_BENNU";
@Parameter(names = "--naif_surfName", @Parameter(
names = "--naif_surfName",
description = "<naif surface name> Allows user to add the " description = "<naif surface name> Allows user to add the "
+ " NAIF_SURFACE_NAME to the default setup file created by the tool. " + " NAIF_SURFACE_NAME to the default setup file created by the tool. "
+ " This may be needed under certain conditions. Optional keyword. " + " This may be needed under certain conditions. Optional keyword. "
+ " Default is not to use it.", + " Default is not to use it.",
required = false, order = 8) required = false,
order = 8)
String naifSurfName = ""; String naifSurfName = "";
@Parameter(names = "--naif_surfCode", @Parameter(
names = "--naif_surfCode",
description = "<integer ID surface code> Allows user to add the " description = "<integer ID surface code> Allows user to add the "
+ " NAIF_SURFACE_CODE to the default setup file created by the tool. " + " NAIF_SURFACE_CODE to the default setup file created by the tool. "
+ " Allows the tool to associate this ID code to the NAIF_SURFACE_NAME. Optional keyword. " + " Allows the tool to associate this ID code to the NAIF_SURFACE_NAME. Optional keyword. "
+ " Default is not to use it.", + " Default is not to use it.",
required = false, order = 9) required = false,
order = 9)
String naifSurfCode = ""; String naifSurfCode = "";
@Parameter(names = "--naif_surfBody", @Parameter(
names = "--naif_surfBody",
description = "<integer body ID code> Allows user to add the " description = "<integer body ID code> Allows user to add the "
+ " NAIF_SURFACE_BODY to the default setup file created by the tool. " + " NAIF_SURFACE_BODY to the default setup file created by the tool. "
+ " This may be needed under certain conditions. Optional keyword." + " This may be needed under certain conditions. Optional keyword."
+ " Default is not to use it.", + " Default is not to use it.",
required = false, order = 10) required = false,
order = 10)
String naifSurfBody = ""; String naifSurfBody = "";
@Parameter(names = "--cmtFile", @Parameter(
names = "--cmtFile",
description = "<commentFile> Specify the comment file" description = "<commentFile> Specify the comment file"
+ " that mkdsk will add to the DSK. Comment file is an ASCII file containing" + " that mkdsk will add to the DSK. Comment file is an ASCII file containing"
+ " additional information about the DSK. Default is single space.", + " additional information about the DSK. Default is single space.",
required = false, order = 11) required = false,
order = 11)
String cmtFile = " "; String cmtFile = " ";
@Parameter(names = "-shortDescription", hidden = true) @Parameter(names = "-shortDescription", hidden = true)
@@ -222,7 +270,6 @@ public class OBJ2DSK implements TerrasaurTool {
+ " output dsk file\n" + " output dsk file\n"
+ " NOTE: MUST set --metakFile if not supplying a custom setup file!") + " NOTE: MUST set --metakFile if not supplying a custom setup file!")
private List<String> files = new ArrayList<>(); private List<String> files = new ArrayList<>();
} }
private final Double fineVoxScale; private final Double fineVoxScale;
@@ -269,7 +316,6 @@ public class OBJ2DSK implements TerrasaurTool {
System.out.println(ex.getMessage()); System.out.println(ex.getMessage());
command = new JCommander(arg); command = new JCommander(arg);
System.exit(1); System.exit(1);
} }
if ((args.length < 1) || (arg.help)) { if ((args.length < 1) || (arg.help)) {
@@ -323,7 +369,6 @@ public class OBJ2DSK implements TerrasaurTool {
NativeLibraryLoader.loadVtkLibraries(); NativeLibraryLoader.loadVtkLibraries();
OBJ2DSK obj2dsk; OBJ2DSK obj2dsk;
if ((Double.isNaN(arg.fineVoxScale)) || (arg.coarseVoxScale < 0)) { if ((Double.isNaN(arg.fineVoxScale)) || (arg.coarseVoxScale < 0)) {
obj2dsk = new OBJ2DSK(); obj2dsk = new OBJ2DSK();
@@ -355,8 +400,8 @@ public class OBJ2DSK implements TerrasaurTool {
} }
System.out.println("Creating default setup file"); System.out.println("Creating default setup file");
setupFile = createSetup(latLonMinMax, obj2dsk.fineVoxScale, obj2dsk.coarseVoxScale, dskParams, setupFile =
outsetupFname); createSetup(latLonMinMax, obj2dsk.fineVoxScale, obj2dsk.coarseVoxScale, dskParams, outsetupFname);
if (keepTmpFiles) { if (keepTmpFiles) {
System.out.println("setup file created here:" + outsetupFname); System.out.println("setup file created here:" + outsetupFname);
} else { } else {
@@ -372,7 +417,6 @@ public class OBJ2DSK implements TerrasaurTool {
throw new RuntimeException(errMesg); throw new RuntimeException(errMesg);
} }
System.out.println("Using custom setup file:" + inputSetup); System.out.println("Using custom setup file:" + inputSetup);
} }
obj2dsk.run(inFile, outFile, latLonMinMax, setupFile, outsetupFname, keepTmpFiles); obj2dsk.run(inFile, outFile, latLonMinMax, setupFile, outsetupFname, keepTmpFiles);
@@ -416,8 +460,14 @@ public class OBJ2DSK implements TerrasaurTool {
} }
public void run(String infile, String outfile, Map<String, Double> latLonMinMax, File setupFile, public void run(
String outsetupFname, boolean keepTmpFiles) throws Exception { String infile,
String outfile,
Map<String, Double> latLonMinMax,
File setupFile,
String outsetupFname,
boolean keepTmpFiles)
throws Exception {
System.out.println("Running OBJ2DSK."); System.out.println("Running OBJ2DSK.");
System.out.println("FINE_VOXEL_SCALE = " + Double.toString(fineVoxScale)); System.out.println("FINE_VOXEL_SCALE = " + Double.toString(fineVoxScale));
@@ -454,15 +504,13 @@ public class OBJ2DSK implements TerrasaurTool {
PolyDataUtil.saveShapeModelAsOBJ(inpolydata, shapeModel.getAbsolutePath()); PolyDataUtil.saveShapeModelAsOBJ(inpolydata, shapeModel.getAbsolutePath());
// Delete dsk file if already exists since otherwise mkdsk will complain // Delete dsk file if already exists since otherwise mkdsk will complain
if (new File(outfile).isFile()) if (new File(outfile).isFile()) new File(outfile).delete();
new File(outfile).delete();
String command = "mkdsk -setup " + setupFile.getAbsolutePath() + " -input " String command = "mkdsk -setup " + setupFile.getAbsolutePath() + " -input " + shapeModel.getAbsolutePath()
+ shapeModel.getAbsolutePath() + " -output " + outfile; + " -output " + outfile;
ProcessUtils.runProgramAndWait(command, null, false); ProcessUtils.runProgramAndWait(command, null, false);
} }
/** /**
* Create the setup file for mkdsk executable. * Create the setup file for mkdsk executable.
* *
@@ -473,8 +521,12 @@ public class OBJ2DSK implements TerrasaurTool {
* @param setupFname * @param setupFname
* @return * @return
*/ */
private static File createSetup(Map<String, Double> latLonCorners, Double fineVoxScale, private static File createSetup(
Integer coarseVoxScale, Map<DSK_KEYS, String> dskParams, String setupFname) { Map<String, Double> latLonCorners,
Double fineVoxScale,
Integer coarseVoxScale,
Map<DSK_KEYS, String> dskParams,
String setupFname) {
// evaluate latlon corners. Exit program if any are NaN. // evaluate latlon corners. Exit program if any are NaN.
evaluateCorners(latLonCorners); evaluateCorners(latLonCorners);
@@ -492,8 +544,8 @@ public class OBJ2DSK implements TerrasaurTool {
} else { } else {
setupFile = new File(setupFname); setupFile = new File(setupFname);
} }
System.out System.out.println("Setup file for mkdsk created here:"
.println("Setup file for mkdsk created here:" + setupFile.getAbsolutePath().toString()); + setupFile.getAbsolutePath().toString());
// relativize the path to the metakernel file. Do this because mkdsk has a limit on the string // relativize the path to the metakernel file. Do this because mkdsk has a limit on the string
// length to the metakernel. Get normalized absolute path to mkFile in case the user enters a // length to the metakernel. Get normalized absolute path to mkFile in case the user enters a
@@ -509,13 +561,12 @@ public class OBJ2DSK implements TerrasaurTool {
String spicefile = relPath.toString(); String spicefile = relPath.toString();
if (spicefile.length() > 80) { if (spicefile.length() > 80) {
System.out.println("Error: pointer to SPICE metakernel kernel file may not be longer than" System.out.println(
+ " 80 characters."); "Error: pointer to SPICE metakernel kernel file may not be longer than" + " 80 characters.");
System.out.println("The paths inside the metakernel file can be as long as 255 characters."); System.out.println("The paths inside the metakernel file can be as long as 255 characters.");
System.exit(1); System.exit(1);
} }
// create the content of setup file // create the content of setup file
StringBuilder sb = new StringBuilder(); StringBuilder sb = new StringBuilder();
sb.append("\\begindata\n"); sb.append("\\begindata\n");
@@ -533,23 +584,20 @@ public class OBJ2DSK implements TerrasaurTool {
sb.append("INPUT_DATA_UNITS = ( 'ANGLES = DEGREES'\n"); sb.append("INPUT_DATA_UNITS = ( 'ANGLES = DEGREES'\n");
sb.append(" 'DISTANCES = KILOMETERS' )\n"); sb.append(" 'DISTANCES = KILOMETERS' )\n");
sb.append("COORDINATE_SYSTEM = 'LATITUDINAL'\n"); sb.append("COORDINATE_SYSTEM = 'LATITUDINAL'\n");
String valueString = String.format("MINIMUM_LATITUDE = %.5f\n", String valueString =
latLonCorners.get(HeaderTag.MINLAT.toString())); String.format("MINIMUM_LATITUDE = %.5f\n", latLonCorners.get(HeaderTag.MINLAT.toString()));
sb.append(valueString); sb.append(valueString);
// out.write("MAXIMUM_LATITUDE = 90\n"); // out.write("MAXIMUM_LATITUDE = 90\n");
valueString = String.format("MAXIMUM_LATITUDE = %.5f\n", valueString = String.format("MAXIMUM_LATITUDE = %.5f\n", latLonCorners.get(HeaderTag.MAXLAT.toString()));
latLonCorners.get(HeaderTag.MAXLAT.toString()));
sb.append(valueString); sb.append(valueString);
// out.write("MINIMUM_LONGITUDE = -180\n"); // out.write("MINIMUM_LONGITUDE = -180\n");
valueString = String.format("MINIMUM_LONGITUDE = %.5f\n", valueString = String.format("MINIMUM_LONGITUDE = %.5f\n", latLonCorners.get(HeaderTag.MINLON.toString()));
latLonCorners.get(HeaderTag.MINLON.toString()));
sb.append(valueString); sb.append(valueString);
// out.write("MAXIMUM_LONGITUDE = 180\n"); // out.write("MAXIMUM_LONGITUDE = 180\n");
valueString = String.format("MAXIMUM_LONGITUDE = %.5f\n", valueString = String.format("MAXIMUM_LONGITUDE = %.5f\n", latLonCorners.get(HeaderTag.MAXLON.toString()));
latLonCorners.get(HeaderTag.MAXLON.toString()));
sb.append(valueString); sb.append(valueString);
sb.append("DATA_TYPE = 2\n"); sb.append("DATA_TYPE = 2\n");
@@ -593,7 +641,6 @@ public class OBJ2DSK implements TerrasaurTool {
System.exit(1); System.exit(1);
} }
return setupFile; return setupFile;
} }
/** /**
@@ -611,5 +658,4 @@ public class OBJ2DSK implements TerrasaurTool {
} }
} }
} }
} }

View File

@@ -172,7 +172,9 @@ public class PointCloudFormatConverter implements TerrasaurTool {
double lon = Math.toRadians(Double.parseDouble(parts[0].trim())); double lon = Math.toRadians(Double.parseDouble(parts[0].trim()));
double lat = Math.toRadians(Double.parseDouble(parts[1].trim())); double lat = Math.toRadians(Double.parseDouble(parts[1].trim()));
double range = Double.parseDouble(parts[2].trim()); double range = Double.parseDouble(parts[2].trim());
double[] xyz = new Vector3D(lon, lat).scalarMultiply(range).toArray(); double[] xyz = new Vector3D(lon, lat)
.scalarMultiply(range)
.toArray();
pointsXYZ.InsertNextPoint(xyz); pointsXYZ.InsertNextPoint(xyz);
} else { } else {
double[] xyz = new double[3]; double[] xyz = new double[3];
@@ -191,8 +193,7 @@ public class PointCloudFormatConverter implements TerrasaurTool {
case BIN3: case BIN3:
case BIN4: case BIN4:
case BIN7: case BIN7:
try (DataInputStream dis = try (DataInputStream dis = new DataInputStream(new BufferedInputStream(new FileInputStream(inFile)))) {
new DataInputStream(new BufferedInputStream(new FileInputStream(inFile)))) {
while (dis.available() > 0) { while (dis.available() > 0) {
if (inFormat == FORMATS.BIN7) { if (inFormat == FORMATS.BIN7) {
// skip time field // skip time field
@@ -202,7 +203,8 @@ public class PointCloudFormatConverter implements TerrasaurTool {
double lon = Math.toRadians(BinaryUtils.readDoubleAndSwap(dis)); double lon = Math.toRadians(BinaryUtils.readDoubleAndSwap(dis));
double lat = Math.toRadians(BinaryUtils.readDoubleAndSwap(dis)); double lat = Math.toRadians(BinaryUtils.readDoubleAndSwap(dis));
double range = BinaryUtils.readDoubleAndSwap(dis); double range = BinaryUtils.readDoubleAndSwap(dis);
double[] xyz = new Vector3D(lon, lat).scalarMultiply(range).toArray(); double[] xyz =
new Vector3D(lon, lat).scalarMultiply(range).toArray();
pointsXYZ.InsertNextPoint(xyz); pointsXYZ.InsertNextPoint(xyz);
} else { } else {
double[] xyz = new double[3]; double[] xyz = new double[3];
@@ -248,8 +250,7 @@ public class PointCloudFormatConverter implements TerrasaurTool {
BoundingBox clipped = bbox.getScaledBoundingBox(clip); BoundingBox clipped = bbox.getScaledBoundingBox(clip);
vtkPoints clippedPoints = new vtkPoints(); vtkPoints clippedPoints = new vtkPoints();
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) { for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
if (clipped.contains(pointsXYZ.GetPoint(i))) if (clipped.contains(pointsXYZ.GetPoint(i))) clippedPoints.InsertNextPoint(pointsXYZ.GetPoint(i));
clippedPoints.InsertNextPoint(pointsXYZ.GetPoint(i));
} }
pointsXYZ = clippedPoints; pointsXYZ = clippedPoints;
polyData = null; polyData = null;
@@ -305,9 +306,7 @@ public class PointCloudFormatConverter implements TerrasaurTool {
} }
} else { } else {
if (halfSize < 0 || groundSampleDistance < 0) { if (halfSize < 0 || groundSampleDistance < 0) {
System.out.printf( logger.error("Must supply -halfSize and -groundSampleDistance for {} output", outFormat);
"Must supply -halfSize and -groundSampleDistance for %s output\n",
outFormat);
return; return;
} }
@@ -366,98 +365,80 @@ public class PointCloudFormatConverter implements TerrasaurTool {
private static Options defineOptions() { private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions(); Options options = TerrasaurTool.defineOptions();
options.addOption( options.addOption(Option.builder("logFile")
Option.builder("logFile")
.hasArg() .hasArg()
.desc("If present, save screen output to log file.") .desc("If present, save screen output to log file.")
.build()); .build());
StringBuilder sb = new StringBuilder(); StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name())); for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption( options.addOption(Option.builder("logLevel")
Option.builder("logLevel")
.hasArg() .hasArg()
.desc( .desc("If present, print messages above selected priority. Valid values are "
"If present, print messages above selected priority. Valid values are "
+ sb.toString().trim() + sb.toString().trim()
+ ". Default is INFO.") + ". Default is INFO.")
.build()); .build());
options.addOption( options.addOption(Option.builder("inputFormat")
Option.builder("inputFormat")
.hasArg() .hasArg()
.desc( .desc("Format of input file. If not present format will be inferred from inputFile extension.")
"Format of input file. If not present format will be inferred from inputFile extension.")
.build()); .build());
options.addOption( options.addOption(Option.builder("inputFile")
Option.builder("inputFile")
.required() .required()
.hasArg() .hasArg()
.desc("Required. Name of input file.") .desc("Required. Name of input file.")
.build()); .build());
options.addOption( options.addOption(Option.builder("outputFormat")
Option.builder("outputFormat")
.hasArg() .hasArg()
.desc( .desc("Format of output file. If not present format will be inferred from outputFile extension.")
"Format of output file. If not present format will be inferred from outputFile extension.")
.build()); .build());
options.addOption( options.addOption(Option.builder("outputFile")
Option.builder("outputFile")
.required() .required()
.hasArg() .hasArg()
.desc("Required. Name of output file.") .desc("Required. Name of output file.")
.build()); .build());
options.addOption( options.addOption(Option.builder("inllr")
Option.builder("inllr")
.desc( .desc(
"Only used with ASCII or BINARY formats. If present, input values are assumed to be lon, lat, rad. Default is x, y, z.") "Only used with ASCII or BINARY formats. If present, input values are assumed to be lon, lat, rad. Default is x, y, z.")
.build()); .build());
options.addOption( options.addOption(Option.builder("outllr")
Option.builder("outllr")
.desc( .desc(
"Only used with ASCII or BINARY formats. If present, output values will be lon, lat, rad. Default is x, y, z.") "Only used with ASCII or BINARY formats. If present, output values will be lon, lat, rad. Default is x, y, z.")
.build()); .build());
options.addOption( options.addOption(Option.builder("centerXYZ")
Option.builder("centerXYZ")
.hasArg() .hasArg()
.desc( .desc(
"Only used to generate OBJ output. Center output shape on supplied coordinates. Specify XYZ coordinates as three floating point numbers separated" "Only used to generate OBJ output. Center output shape on supplied coordinates. Specify XYZ coordinates as three floating point numbers separated"
+ " by commas.") + " by commas.")
.build()); .build());
options.addOption( options.addOption(Option.builder("centerLonLat")
Option.builder("centerLonLat")
.hasArg() .hasArg()
.desc( .desc(
"Only used to generate OBJ output. Center output shape on supplied lon,lat. Specify lon,lat in degrees as floating point numbers separated" "Only used to generate OBJ output. Center output shape on supplied lon,lat. Specify lon,lat in degrees as floating point numbers separated"
+ " by a comma. Shape will be centered on the point closest to this lon,lat pair.") + " by a comma. Shape will be centered on the point closest to this lon,lat pair.")
.build()); .build());
options.addOption( options.addOption(Option.builder("halfSize")
Option.builder("halfSize")
.hasArg() .hasArg()
.desc( .desc(
"Only used to generate OBJ output. Used with -groundSampleDistance to resample to a uniform grid. Grid dimensions are (2*halfSize+1)x(2*halfSize+1).") "Only used to generate OBJ output. Used with -groundSampleDistance to resample to a uniform grid. Grid dimensions are (2*halfSize+1)x(2*halfSize+1).")
.build()); .build());
options.addOption( options.addOption(Option.builder("groundSampleDistance")
Option.builder("groundSampleDistance")
.hasArg() .hasArg()
.desc( .desc(
"Used with -halfSize to resample to a uniform grid. Spacing between grid points. Only used to generate OBJ output. " "Used with -halfSize to resample to a uniform grid. Spacing between grid points. Only used to generate OBJ output. "
+ "Units are the same as the input file, usually km.") + "Units are the same as the input file, usually km.")
.build()); .build());
options.addOption( options.addOption(Option.builder("mapRadius")
Option.builder("mapRadius")
.hasArg() .hasArg()
.desc( .desc(
"Only used to generate OBJ output. Used with -centerXYZ to resample to a uniform grid. Only include points within " "Only used to generate OBJ output. Used with -centerXYZ to resample to a uniform grid. Only include points within "
+ "mapRadius*groundSampleDistance*halfSize of centerXYZ. Default value is sqrt(2).") + "mapRadius*groundSampleDistance*halfSize of centerXYZ. Default value is sqrt(2).")
.build()); .build());
options.addOption( options.addOption(Option.builder("gmtArgs")
Option.builder("gmtArgs")
.hasArg() .hasArg()
.longOpt("gmt-args") .longOpt("gmt-args")
.desc( .desc(
"Only used to generate OBJ output. Pass additional options to GMTSurface. May be used multiple times, use once per additional argument.") "Only used to generate OBJ output. Pass additional options to GMTSurface. May be used multiple times, use once per additional argument.")
.build()); .build());
options.addOption( options.addOption(Option.builder("clip")
Option.builder("clip")
.hasArg() .hasArg()
.desc( .desc(
"Shrink bounding box to a relative size of <arg> and clip any points outside of it. Default is 1 (no clipping).") "Shrink bounding box to a relative size of <arg> and clip any points outside of it. Default is 1 (no clipping).")
@@ -484,12 +465,10 @@ public class PointCloudFormatConverter implements TerrasaurTool {
boolean inLLR = cl.hasOption("inllr"); boolean inLLR = cl.hasOption("inllr");
boolean outLLR = cl.hasOption("outllr"); boolean outLLR = cl.hasOption("outllr");
FORMATS inFormat = FORMATS inFormat = cl.hasOption("inputFormat")
cl.hasOption("inputFormat")
? FORMATS.valueOf(cl.getOptionValue("inputFormat").toUpperCase()) ? FORMATS.valueOf(cl.getOptionValue("inputFormat").toUpperCase())
: FORMATS.formatFromExtension(cl.getOptionValue("inputFile")); : FORMATS.formatFromExtension(cl.getOptionValue("inputFile"));
FORMATS outFormat = FORMATS outFormat = cl.hasOption("outputFormat")
cl.hasOption("outputFormat")
? FORMATS.valueOf(cl.getOptionValue("outputFormat").toUpperCase()) ? FORMATS.valueOf(cl.getOptionValue("outputFormat").toUpperCase())
: FORMATS.formatFromExtension(cl.getOptionValue("outputFile")); : FORMATS.formatFromExtension(cl.getOptionValue("outputFile"));
@@ -516,8 +495,7 @@ public class PointCloudFormatConverter implements TerrasaurTool {
if (cl.hasOption("centerLonLat")) { if (cl.hasOption("centerLonLat")) {
String[] params = cl.getOptionValue("centerLonLat").split(","); String[] params = cl.getOptionValue("centerLonLat").split(",");
Vector3D lcDir = Vector3D lcDir = new Vector3D(
new Vector3D(
Math.toRadians(Double.parseDouble(params[0].trim())), Math.toRadians(Double.parseDouble(params[0].trim())),
Math.toRadians(Double.parseDouble(params[1].trim()))); Math.toRadians(Double.parseDouble(params[1].trim())));
double[] center = null; double[] center = null;

View File

@@ -0,0 +1,399 @@
/*
* The MIT License
* Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in
* all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
* THE SOFTWARE.
*/
package terrasaur.apps;
import java.awt.geom.Path2D;
import java.awt.geom.Point2D;
import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import java.util.Map;
import nom.tam.fits.FitsException;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.commons.math3.geometry.euclidean.twod.Vector2D;
import org.apache.commons.math3.geometry.euclidean.twod.hull.MonotoneChain;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import picante.math.coords.LatitudinalVector;
import spice.basic.SpiceException;
import spice.basic.Vector3;
import terrasaur.enums.FORMATS;
import terrasaur.templates.TerrasaurTool;
import terrasaur.utils.NativeLibraryLoader;
import terrasaur.utils.VectorStatistics;
import terrasaur.utils.tessellation.StereographicProjection;
import vtk.vtkPoints;
public class PointCloudOverlap implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
@Override
public String shortDescription() {
return "Find points in a point cloud which overlap a reference point cloud.";
}
@Override
public String fullDescription(Options options) {
String header = "";
String footer =
"\nThis program finds all points in the input point cloud which overlap the points in a reference point cloud.\n\n"
+ "Supported input formats are ASCII, BINARY, L2, OBJ, and VTK. Supported output formats are ASCII, BINARY, L2, and VTK. "
+ "ASCII format is white spaced delimited x y z coordinates. BINARY files must contain double precision x y z coordinates.\n\n"
+ "A plane is fit to the reference point cloud and all points in each cloud are projected onto this plane. Any point in the "
+ "projected input cloud which falls within the outline of the projected reference cloud is considered to be overlapping.";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
private Path2D.Double polygon;
private StereographicProjection proj;
/*- Useful for debugging
private vtkPoints ref2DPoints;
private vtkPoints input2DPoints;
private vtkPolyData polygonPolyData;
private vtkCellArray polygonCells;
private vtkPoints polygonPoints;
private vtkDoubleArray polygonSuccessArray;
*/
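/**
 * Build the overlap tester. The stereographic projection is centered on the mean direction of
 * the reference points, and the convex hull of the projected reference cloud becomes the outline
 * used for later containment tests.
 *
 * @param refPoints reference point cloud as x, y, z triples; may be null (e.g. when only an
 *     instance for command line parsing is needed)
 */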
public PointCloudOverlap(Collection<double[]> refPoints) {
if (refPoints != null) {
VectorStatistics vStats = new VectorStatistics();
for (double[] pt : refPoints) vStats.add(new Vector3(pt));
Vector3D centerXYZ = vStats.getMean();
proj = new StereographicProjection(new LatitudinalVector(1, centerXYZ.getDelta(), centerXYZ.getAlpha()));
createRefPolygon(refPoints);
}
}
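/** @return the stereographic projection centered on the mean direction of the reference cloud */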
public StereographicProjection getProjection() {
return proj;
}
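/**
 * Project the reference points into the stereographic plane and store the convex hull of the
 * projected points as a closed Path2D outline.
 */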
private void createRefPolygon(Collection<double[]> refPoints) {
List<Vector2D> stereographicPoints = new ArrayList<>();
for (double[] refPt : refPoints) {
Vector3D point3D = new Vector3D(refPt);
Point2D point = proj.forward(point3D.getDelta(), point3D.getAlpha());
stereographicPoints.add(new Vector2D(point.getX(), point.getY()));
}
/*-
ref2DPoints = new vtkPoints();
input2DPoints = new vtkPoints();
polygonPolyData = new vtkPolyData();
polygonCells = new vtkCellArray();
polygonPoints = new vtkPoints();
polygonSuccessArray = new vtkDoubleArray();
polygonSuccessArray.SetName("success");
polygonPolyData.SetPoints(polygonPoints);
polygonPolyData.SetLines(polygonCells);
polygonPolyData.GetCellData().AddArray(polygonSuccessArray);
for (Vector2D refPoint : stereographicPoints)
ref2DPoints.InsertNextPoint(refPoint.getX(), refPoint.getY(), 0);
*/
MonotoneChain mc = new MonotoneChain();
List<Vector2D> vertices = new ArrayList<>(mc.findHullVertices(stereographicPoints));
/*-
for (int i = 1; i < vertices.size(); i++) {
Vector2D lastPt = vertices.get(i - 1);
Vector2D thisPt = vertices.get(i);
System.out.printf("%f %f to %f %f distance %f\n", lastPt.getX(), lastPt.getY(), thisPt.getX(),
thisPt.getY(), lastPt.distance(thisPt));
}
Vector2D lastPt = vertices.get(vertices.size() - 1);
Vector2D thisPt = vertices.get(0);
System.out.printf("%f %f to %f %f distance %f\n", lastPt.getX(), lastPt.getY(), thisPt.getX(),
thisPt.getY(), lastPt.distance(thisPt));
*/
// int id0 = 0;
for (Vector2D vertex : vertices) {
// int id1 = polygonPoints.InsertNextPoint(vertex.getX(), vertex.getY(), 0);
if (polygon == null) {
polygon = new Path2D.Double();
polygon.moveTo(vertex.getX(), vertex.getY());
} else {
polygon.lineTo(vertex.getX(), vertex.getY());
/*-
vtkLine line = new vtkLine();
line.GetPointIds().SetId(0, id0);
line.GetPointIds().SetId(1, id1);
polygonCells.InsertNextCell(line);
*/
}
// id0 = id1;
}
polygon.closePath();
/*-
vtkPolyDataWriter writer = new vtkPolyDataWriter();
writer.SetInputData(polygonPolyData);
writer.SetFileName("polygon2D.vtk");
writer.SetFileTypeToBinary();
writer.Update();
writer = new vtkPolyDataWriter();
polygonPolyData = new vtkPolyData();
polygonPolyData.SetPoints(ref2DPoints);
writer.SetInputData(polygonPolyData);
writer.SetFileName("refPoints.vtk");
writer.SetFileTypeToBinary();
writer.Update();
*/
}
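/**
 * @param xyz point to test, as x, y, z
 * @return true if the point, projected into the stereographic plane, lies inside the reference
 *     outline
 */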
public boolean isEnclosed(double[] xyz) {
Vector3D point = new Vector3D(xyz);
Point2D projected = proj.forward(point.getDelta(), point.getAlpha());
return polygon.contains(projected.getX(), projected.getY());
}
/**
* @param inputPoints points to consider
* @param scale scale factor applied to the convex hull of the projected points, about their centroid
* @return indices of all points inside the scaled polygon
*/
public List<Integer> scalePoints(List<double[]> inputPoints, double scale) {
List<Vector2D> projected = new ArrayList<>();
for (double[] inputPoint : inputPoints) {
Vector3D point = new Vector3D(inputPoint);
Point2D projectedPoint = proj.forward(point.getDelta(), point.getAlpha());
projected.add(new Vector2D(projectedPoint.getX(), projectedPoint.getY()));
}
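// centroid of the projected points; the hull is scaled about this point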
Vector2D center = new Vector2D(0, 0);
for (Vector2D inputPoint : projected) center = center.add(inputPoint);
center = center.scalarMultiply(1. / inputPoints.size());
List<Vector2D> translatedPoints = new ArrayList<>();
for (Vector2D inputPoint : projected) translatedPoints.add(inputPoint.subtract(center));
Path2D.Double thisPolygon = null;
MonotoneChain mc = new MonotoneChain();
Collection<Vector2D> vertices = mc.findHullVertices(translatedPoints);
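// rebuild the hull outline with each vertex scaled about the centroid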
for (Vector2D vertex : vertices) {
if (thisPolygon == null) {
thisPolygon = new Path2D.Double();
thisPolygon.moveTo(scale * vertex.getX(), scale * vertex.getY());
} else {
thisPolygon.lineTo(scale * vertex.getX(), scale * vertex.getY());
}
}
thisPolygon.closePath();
List<Integer> indices = new ArrayList<>();
for (int i = 0; i < projected.size(); i++) {
Vector2D inputPoint = projected.get(i);
if (thisPolygon.contains(inputPoint.getX() - center.getX(), inputPoint.getY() - center.getY()))
indices.add(i);
}
return indices;
}
private static Options defineOptions() {
Options options = new Options();
options.addOption(Option.builder("inputFormat")
.hasArg()
.desc("Format of input file. If not present format will be inferred from file extension.")
.build());
options.addOption(Option.builder("inputFile")
.required()
.hasArg()
.desc("Required. Name of input file.")
.build());
options.addOption(Option.builder("inllr")
.desc(
"If present, input values are assumed to be lon, lat, rad. Default is x, y, z. Only used with ASCII or BINARY formats.")
.build());
options.addOption(Option.builder("referenceFormat")
.hasArg()
.desc("Format of reference file. If not present format will be inferred from file extension.")
.build());
options.addOption(Option.builder("referenceFile")
.required()
.hasArg()
.desc("Required. Name of reference file.")
.build());
options.addOption(Option.builder("refllr")
.desc(
"If present, reference values are assumed to be lon, lat, rad. Default is x, y, z. Only used with ASCII or BINARY formats.")
.build());
options.addOption(Option.builder("outputFormat")
.hasArg()
.desc("Format of output file. If not present format will be inferred from file extension.")
.build());
options.addOption(Option.builder("outputFile")
.required()
.hasArg()
.desc("Required. Name of output file.")
.build());
options.addOption(Option.builder("outllr")
.desc(
"If present, output values will be lon, lat, rad. Default is x, y, z. Only used with ASCII or BINARY formats.")
.build());
options.addOption(Option.builder("scale")
.hasArg()
.desc("Value to scale bounding box containing intersect region. Default is 1.0.")
.build());
return options;
}
public static void main(String[] args) throws SpiceException, IOException, InterruptedException, FitsException {
NativeLibraryLoader.loadVtkLibraries();
NativeLibraryLoader.loadSpiceLibraries();
TerrasaurTool defaultOBJ = new PointCloudOverlap(null);
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
// Read the reference file
FORMATS refFormat = cl.hasOption("referenceFormat")
? FORMATS.valueOf(cl.getOptionValue("referenceFormat").toUpperCase())
: FORMATS.formatFromExtension(cl.getOptionValue("referenceFile"));
String refFile = cl.getOptionValue("referenceFile");
boolean refLLR = cl.hasOption("refllr");
PointCloudFormatConverter pcfc = new PointCloudFormatConverter(refFormat, FORMATS.VTK);
pcfc.read(refFile, refLLR);
vtkPoints referencePoints = pcfc.getPoints();
logger.info("{} points read from {}", referencePoints.GetNumberOfPoints(), refFile);
List<double[]> refPts = new ArrayList<>();
for (int i = 0; i < referencePoints.GetNumberOfPoints(); i++) {
refPts.add(referencePoints.GetPoint(i));
}
// create the overlap object and set the enclosing polygon
PointCloudOverlap pco = new PointCloudOverlap(refPts);
// Read the input point cloud
FORMATS inFormat = cl.hasOption("inputFormat")
? FORMATS.valueOf(cl.getOptionValue("inputFormat").toUpperCase())
: FORMATS.formatFromExtension(cl.getOptionValue("inputFile"));
String inFile = cl.getOptionValue("inputFile");
boolean inLLR = cl.hasOption("inllr");
pcfc = new PointCloudFormatConverter(inFormat, FORMATS.VTK);
pcfc.read(inFile, inLLR);
vtkPoints inputPoints = pcfc.getPoints();
logger.info("{} points read from {}", inputPoints.GetNumberOfPoints(), inFile);
List<Integer> enclosedIndices = new ArrayList<>();
for (int i = 0; i < inputPoints.GetNumberOfPoints(); i++) {
double[] pt = inputPoints.GetPoint(i);
if (pco.isEnclosed(pt)) enclosedIndices.add(i);
}
if (cl.hasOption("scale")) {
List<double[]> pts = new ArrayList<>();
for (Integer i : enclosedIndices) pts.add(inputPoints.GetPoint(i));
// this list includes which of the enclosed points are inside the scaled polygon
List<Integer> theseIndices = pco.scalePoints(pts, Double.parseDouble(cl.getOptionValue("scale")));
// now relate this list back to the original list of points
List<Integer> newIndices = new ArrayList<>();
for (Integer i : theseIndices) newIndices.add(enclosedIndices.get(i));
enclosedIndices = newIndices;
}
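// accumulate statistics on the overlapping points, both in 3D and in the projected plane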
VectorStatistics xyzStats = new VectorStatistics();
VectorStatistics xyStats = new VectorStatistics();
for (Integer i : enclosedIndices) {
double[] thisPt = inputPoints.GetPoint(i);
Vector3D thisPt3D = new Vector3D(thisPt);
xyzStats.add(thisPt3D);
Point2D projectedPt = pco.getProjection().forward(thisPt3D.getDelta(), thisPt3D.getAlpha());
xyStats.add(new Vector3(projectedPt.getX(), projectedPt.getY(), 0));
}
logger.info(
"Center XYZ: {}, {}, {}",
xyzStats.getMean().getX(),
xyzStats.getMean().getY(),
xyzStats.getMean().getZ());
Vector3D centerXYZ = xyzStats.getMean();
logger.info(
"Center lon, lat: {}, {}\n",
Math.toDegrees(centerXYZ.getAlpha()),
Math.toDegrees(centerXYZ.getDelta()));
logger.info(
"xmin/xmax/extent: {}/{}/{}\n",
xyzStats.getMin().getX(),
xyzStats.getMax().getX(),
xyzStats.getMax().getX() - xyzStats.getMin().getX());
logger.info(
"ymin/ymax/extent: {}/{}/{}\n",
xyzStats.getMin().getY(),
xyzStats.getMax().getY(),
xyzStats.getMax().getY() - xyzStats.getMin().getY());
logger.info(
"zmin/zmax/extent: {}/{}/{}\n",
xyzStats.getMin().getZ(),
xyzStats.getMax().getZ(),
xyzStats.getMax().getZ() - xyzStats.getMin().getZ());
FORMATS outFormat = cl.hasOption("outputFormat")
? FORMATS.valueOf(cl.getOptionValue("outputFormat").toUpperCase())
: FORMATS.formatFromExtension(cl.getOptionValue("outputFile"));
vtkPoints pointsToWrite = new vtkPoints();
for (Integer i : enclosedIndices) pointsToWrite.InsertNextPoint(inputPoints.GetPoint(i));
pcfc = new PointCloudFormatConverter(FORMATS.VTK, outFormat);
pcfc.setPoints(pointsToWrite);
String outputFilename = cl.getOptionValue("outputFile");
pcfc.write(outputFilename, cl.hasOption("outllr"));
if (new File(outputFilename).exists()) {
logger.info("{} points written to {}", pointsToWrite.GetNumberOfPoints(), outputFilename);
} else {
logger.error("Could not write {}", outputFilename);
}
logger.info("Finished");
}
}
// TODO write out center of output pointcloud

View File

@@ -156,8 +156,7 @@ public class PointCloudToPlane implements TerrasaurTool {
if (dist > halfExtent) halfExtent = dist; if (dist > halfExtent) halfExtent = dist;
} }
ProjectionOrthographic p = ProjectionOrthographic p = new ProjectionOrthographic(
new ProjectionOrthographic(
config.width(), config.width(),
config.height(), config.height(),
CoordConverters.convertToLatitudinal( CoordConverters.convertToLatitudinal(
@@ -169,8 +168,7 @@ public class PointCloudToPlane implements TerrasaurTool {
canvas.plot(data); canvas.plot(data);
((MapPlot) canvas).drawLatLonGrid(Math.toRadians(5), Math.toRadians(5), true); ((MapPlot) canvas).drawLatLonGrid(Math.toRadians(5), Math.toRadians(5), true);
canvas.drawColorBar( canvas.drawColorBar(ImmutableColorBar.builder()
ImmutableColorBar.builder()
.rect(new Rectangle(config.leftMargin(), 40, config.width(), 10)) .rect(new Rectangle(config.leftMargin(), 40, config.width(), 10))
.ramp(ramp) .ramp(ramp)
.numTicks(5) .numTicks(5)
@@ -196,8 +194,7 @@ public class PointCloudToPlane implements TerrasaurTool {
canvas.drawAxes(); canvas.drawAxes();
canvas.plot(data); canvas.plot(data);
canvas.drawColorBar( canvas.drawColorBar(ImmutableColorBar.builder()
ImmutableColorBar.builder()
.rect(new Rectangle(config.leftMargin(), 40, config.width(), 10)) .rect(new Rectangle(config.leftMargin(), 40, config.width(), 10))
.ramp(ramp) .ramp(ramp)
.numTicks(5) .numTicks(5)
@@ -209,72 +206,54 @@ public class PointCloudToPlane implements TerrasaurTool {
private static Options defineOptions() { private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions(); Options options = TerrasaurTool.defineOptions();
options.addOption( options.addOption(Option.builder("inputFormat")
Option.builder("inputFormat")
.hasArg() .hasArg()
.desc( .desc("Format of input file. If not present format is inferred from inputFile extension.")
"Format of input file. If not present format is inferred from inputFile extension.")
.build()); .build());
options.addOption( options.addOption(Option.builder("inputFile")
Option.builder("inputFile")
.required() .required()
.hasArg() .hasArg()
.desc("Required. Name of input file.") .desc("Required. Name of input file.")
.build()); .build());
options.addOption( options.addOption(Option.builder("inllr")
Option.builder("inllr")
.desc( .desc(
"If present, input values are assumed to be lon, lat, rad. Default is x, y, z. Only used with ASCII or BINARY formats.") "If present, input values are assumed to be lon, lat, rad. Default is x, y, z. Only used with ASCII or BINARY formats.")
.build()); .build());
options.addOption( options.addOption(Option.builder("logFile")
Option.builder("logFile")
.hasArg() .hasArg()
.desc("If present, save screen output to log file.") .desc("If present, save screen output to log file.")
.build()); .build());
StringBuilder sb = new StringBuilder(); StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name())); for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption( options.addOption(Option.builder("logLevel")
Option.builder("logLevel")
.hasArg() .hasArg()
.desc( .desc("If present, print messages above selected priority. Valid values are "
"If present, print messages above selected priority. Valid values are "
+ sb.toString().trim() + sb.toString().trim()
+ ". Default is INFO.") + ". Default is INFO.")
.build()); .build());
options.addOption( options.addOption(Option.builder("outputFile")
Option.builder("outputFile")
.hasArg() .hasArg()
.desc( .desc(
"Name of output file to contain 4x4 transformation matrix. The top left 3x3 matrix is the rotation matrix. The top " "Name of output file to contain 4x4 transformation matrix. The top left 3x3 matrix is the rotation matrix. The top "
+ "three entries in the right hand column are the translation vector. The bottom row is always 0 0 0 1.\nTo convert " + "three entries in the right hand column are the translation vector. The bottom row is always 0 0 0 1.\nTo convert "
+ "from global to local:\n transformed = rotation.mxv(point.sub(translation))") + "from global to local:\n transformed = rotation.mxv(point.sub(translation))")
.build()); .build());
options.addOption( options.addOption(Option.builder("translate")
Option.builder("translate")
.hasArg() .hasArg()
.desc( .desc("Translate surface points and spacecraft position. "
"Translate surface points and spacecraft position. "
+ "Specify by three floating point numbers separated by commas. " + "Specify by three floating point numbers separated by commas. "
+ "Default is to use centroid of input point cloud.") + "Default is to use centroid of input point cloud.")
.build()); .build());
options.addOption( options.addOption(Option.builder("plotXYZ")
Option.builder("plotXYZ")
.hasArg() .hasArg()
.desc( .desc("Plot X vs Y (in the local frame) colored by Z. " + "Argument is the name of PNG file to write.")
"Plot X vs Y (in the local frame) colored by Z. "
+ "Argument is the name of PNG file to write.")
.build()); .build());
options.addOption( options.addOption(Option.builder("plotXYR")
Option.builder("plotXYR")
.hasArg() .hasArg()
.desc( .desc("Plot X vs Y (in the local frame) colored by R. " + "Argument is the name of PNG file to write.")
"Plot X vs Y (in the local frame) colored by R. "
+ "Argument is the name of PNG file to write.")
.build()); .build());
options.addOption( options.addOption(Option.builder("slope")
Option.builder("slope") .desc("Choose local coordinate frame such that Z points normal to the plane "
.desc(
"Choose local coordinate frame such that Z points normal to the plane "
+ "and X points along the direction of steepest descent.") + "and X points along the direction of steepest descent.")
.build()); .build());
return options; return options;
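
The -outputFile description above spells out the matrix layout: a 3x3 rotation in the upper left, the translation vector in the right-hand column, and 0 0 0 1 as the bottom row, applied as transformed = rotation.mxv(point.sub(translation)). A minimal stand-alone sketch of that conversion using plain arrays rather than the Terrasaur/SPICE classes (the class and method names below are illustrative only)::

    // A minimal sketch (not Terrasaur code): transformed = rotation * (point - translation)
    public final class GlobalToLocalExample {
      static double[] globalToLocal(double[][] m, double[] point) {
        // the right-hand column of the 4x4 matrix holds the translation vector
        double[] d = {point[0] - m[0][3], point[1] - m[1][3], point[2] - m[2][3]};
        double[] local = new double[3];
        for (int i = 0; i < 3; i++) // multiply by the top left 3x3 rotation matrix
          local[i] = m[i][0] * d[0] + m[i][1] * d[1] + m[i][2] * d[2];
        return local;
      }

      public static void main(String[] args) {
        double[][] identity = {{1, 0, 0, 0}, {0, 1, 0, 0}, {0, 0, 1, 0}, {0, 0, 0, 1}};
        System.out.println(java.util.Arrays.toString(globalToLocal(identity, new double[] {1, 2, 3})));
      }
    }
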
@@ -297,8 +276,7 @@ public class PointCloudToPlane implements TerrasaurTool {
 String inFile = cl.getOptionValue("inputFile");
 boolean inLLR = cl.hasOption("inllr");
-FORMATS inFormat =
-cl.hasOption("inputFormat")
+FORMATS inFormat = cl.hasOption("inputFormat")
 ? FORMATS.valueOf(cl.getOptionValue("inputFormat").toUpperCase())
 : FORMATS.formatFromExtension(inFile);
@@ -314,29 +292,23 @@ public class PointCloudToPlane implements TerrasaurTool {
 Vector3 translation;
 if (cl.hasOption("translate")) {
-translation =
-MathConversions.toVector3(VectorUtils.stringToVector3D(cl.getOptionValue("translate")));
+translation = MathConversions.toVector3(VectorUtils.stringToVector3D(cl.getOptionValue("translate")));
 pctp.getGMU().setTranslation(translation.toArray());
 }
 pctp.getGMU().calculateTransformation();
 List<Vector3> globalPts = new ArrayList<>();
-for (int i = 0; i < points.GetNumberOfPoints(); i++)
-globalPts.add(new Vector3(points.GetPoint(i)));
+for (int i = 0; i < points.GetNumberOfPoints(); i++) globalPts.add(new Vector3(points.GetPoint(i)));
 double[][] transformation = pctp.getGMU().getTransformation();
-StringBuilder sb =
-new StringBuilder(
-String.format(
+StringBuilder sb = new StringBuilder(String.format(
 "translation vector:\n%24.16e%24.16e%24.16e\n",
 transformation[0][3], transformation[1][3], transformation[2][3]));
 logger.info(sb.toString());
 sb = new StringBuilder("rotation matrix:\n");
 for (int i = 0; i < 3; i++)
-sb.append(
-String.format(
-"%24.16e%24.16e%24.16e\n",
-transformation[i][0], transformation[i][1], transformation[i][2]));
+sb.append(String.format(
+"%24.16e%24.16e%24.16e\n", transformation[i][0], transformation[i][1], transformation[i][2]));
 logger.info(sb.toString());
 Matrix33 rotation = new Matrix33(pctp.getGMU().getRotation());

View File

@@ -24,7 +24,6 @@ package terrasaur.apps;
 import java.util.ArrayList;
 import java.util.Map;
 import org.apache.commons.cli.CommandLine;
 import org.apache.commons.cli.Option;
 import org.apache.commons.cli.Options;
@@ -63,8 +62,11 @@ public class PrintShapeModelStatistics implements TerrasaurTool {
 private static Options defineOptions() {
 Options options = TerrasaurTool.defineOptions();
-options.addOption(
-Option.builder("objFile").required().hasArg().desc("Path to OBJ file.").build());
+options.addOption(Option.builder("objFile")
+.required()
+.hasArg()
+.desc("Path to OBJ file.")
+.build());
 return options;
 }

View File

@@ -172,8 +172,7 @@ public class RangeFromSumFile implements TerrasaurTool {
 polyData.GetPoint(id1, pt1);
 polyData.GetPoint(id2, pt2);
-TriangularFacet facet =
-new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
+TriangularFacet facet = new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
 Vector3 center3 = MathConversions.toVector3(facet.getCenter());
 Vector3D center3D = MathConversions.toVector3D(facet.getCenter());
@@ -185,7 +184,6 @@ public class RangeFromSumFile implements TerrasaurTool {
 tiltDir = Tilts.basicTiltDirDeg(surfaceIntercept.getAlpha(), normal3D);
 incidence = Vector3D.angle(sunXYZ, normal3D);
 emission = Vector3D.angle(scPos, normal3D);
 phase = Vector3D.angle(sunXYZ, scPos.subtract(center3D));
@@ -193,7 +191,8 @@ public class RangeFromSumFile implements TerrasaurTool {
 try {
 // scPos is in body fixed coordinates
 Plane p = new Plane(normal3, center3);
-Vector3 projectedNorth = p.project(new Vector3(0, 0, 1).add(center3)).sub(center3);
+Vector3 projectedNorth =
+p.project(new Vector3(0, 0, 1).add(center3)).sub(center3);
 Vector3 projected = p.project(MathConversions.toVector3(scPos)).sub(center3);
 scAzimuth = projected.sep(projectedNorth);
@@ -201,7 +200,8 @@ public class RangeFromSumFile implements TerrasaurTool {
 scElevation = Math.PI / 2 - emission;
 // sunXYZ is a unit vector pointing to the sun
-projected = p.project(MathConversions.toVector3(sunXYZ).add(center3)).sub(center3);
+projected = p.project(MathConversions.toVector3(sunXYZ).add(center3))
+.sub(center3);
 sunAzimuth = projected.sep(projectedNorth);
 if (projected.cross(projectedNorth).dot(center3) < 0) sunAzimuth = 2 * Math.PI - sunAzimuth;
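
The hunk above derives spacecraft and sun azimuths by projecting each direction onto the facet plane and measuring its separation from projected north, flipping to the far half-circle when the cross product points inward. A minimal stand-alone sketch of that idea with Apache Commons Math vectors, independent of the SPICE Plane class used here (class and variable names are illustrative)::

    import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;

    // A minimal sketch (not Terrasaur code) of the azimuth calculation above.
    public final class AzimuthSketch {
      static Vector3D projectIntoPlane(Vector3D v, Vector3D unitNormal) {
        return v.subtract(unitNormal.scalarMultiply(v.dotProduct(unitNormal)));
      }

      static double azimuth(Vector3D direction, Vector3D unitNormal, Vector3D outwardRadial) {
        Vector3D north = projectIntoPlane(Vector3D.PLUS_K, unitNormal); // +Z projected into the plane
        Vector3D proj = projectIntoPlane(direction, unitNormal);
        double az = Vector3D.angle(proj, north);
        // mirror into the 180-360 degree half when the cross product points inward, as above
        if (proj.crossProduct(north).dotProduct(outwardRadial) < 0) az = 2 * Math.PI - az;
        return az;
      }

      public static void main(String[] args) {
        Vector3D normal = new Vector3D(0.1, 0.0, 1.0).normalize(); // slightly tilted facet
        System.out.println(Math.toDegrees(azimuth(new Vector3D(1, 1, 0), normal, normal)));
      }
    }
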
@@ -295,53 +295,45 @@ public class RangeFromSumFile implements TerrasaurTool {
 private static Options defineOptions() {
 Options options = TerrasaurTool.defineOptions();
-options.addOption(
-Option.builder("sumFile")
+options.addOption(Option.builder("sumFile")
 .required()
 .hasArg()
 .desc("Required. Name of sum file to read.")
 .build());
-options.addOption(
-Option.builder("objFile")
+options.addOption(Option.builder("objFile")
 .required()
 .hasArg()
 .desc("Required. Name of OBJ shape file.")
 .build());
-options.addOption(
-Option.builder("pixelOffset")
+options.addOption(Option.builder("pixelOffset")
 .hasArg()
 .desc(
 "Pixel offset from center of image, given as a comma separated pair (no spaces). Default is 0,0. "
 + "x increases to the right and y increases down.")
 .build());
-options.addOption(
-Option.builder("xRange")
+options.addOption(Option.builder("xRange")
 .hasArg()
 .desc(
 "Range of X pixel offsets from center of image, given as a comma separated triplet (xStart, xStop, xSpacing with no spaces). "
 + "For example -50,50,5.")
 .build());
-options.addOption(
-Option.builder("yRange")
+options.addOption(Option.builder("yRange")
 .hasArg()
 .desc(
 "Range of Y pixel offsets from center of image, given as a comma separated triplet (yStart, yStop, ySpacing with no spaces). "
 + "For example -50,50,5.")
 .build());
-options.addOption(
-Option.builder("radius")
+options.addOption(Option.builder("radius")
 .hasArg()
 .desc(
 "Evaluate all pixels within specified distance (in pixels) of desired pixel. This value will be rounded to the nearest integer.")
 .build());
-options.addOption(
-Option.builder("distanceScale")
+options.addOption(Option.builder("distanceScale")
 .hasArg()
 .desc(
 "Spacecraft position is assumed to be in kilometers. If not, scale by this value (e.g. Use 0.001 if s/c pos is in meters).")
 .build());
-options.addOption(
-Option.builder("stats")
+options.addOption(Option.builder("stats")
 .desc("Print out statistics about range to all selected pixels.")
 .build());
 return options;
@@ -417,8 +409,7 @@ public class RangeFromSumFile implements TerrasaurTool {
 if (checkRadius > 0) {
 double midx = (xStart + xStop) / 2.;
 double midy = (yStart + yStop) / 2.;
-if ((ix - midx) * (ix - midx) + (iy - midy) * (iy - midy) > checkRadius * checkRadius)
-continue;
+if ((ix - midx) * (ix - midx) + (iy - midy) * (iy - midy) > checkRadius * checkRadius) continue;
 }
 long cellID = rfsf.findIntercept(ix, iy).getKey();
 if (cellID > -1) System.out.println(rfsf);

View File

@@ -155,9 +155,13 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
 for (int i = 0; i < points.GetNumberOfPoints(); i++) {
 Vector3D thisPoint = new Vector3D(points.GetPoint(i));
 if (scale != null)
-thisPoint = thisPoint.subtract(center).scalarMultiply(scale).add(center);
+thisPoint = thisPoint
+.subtract(center)
+.scalarMultiply(scale)
+.add(center);
 if (rotation != null)
-thisPoint = rotation.applyTo(thisPoint.subtract(center)).add(center);
+thisPoint =
+rotation.applyTo(thisPoint.subtract(center)).add(center);
 points.SetPoint(i, thisPoint.toArray());
 }
 }
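
The hunk above scales and rotates shape-model vertices about a fixed center by translating to the center, applying the transform, and translating back. A minimal stand-alone sketch of the same pattern with Commons Math, not part of the tool itself (class and variable names are illustrative)::

    import org.apache.commons.math3.geometry.euclidean.threed.Rotation;
    import org.apache.commons.math3.geometry.euclidean.threed.RotationConvention;
    import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;

    // A minimal sketch (not Terrasaur code) of scale/rotate about a center.
    public final class TransformAboutCenter {
      static Vector3D scaleAbout(Vector3D p, Vector3D center, double scale) {
        return p.subtract(center).scalarMultiply(scale).add(center);
      }

      static Vector3D rotateAbout(Vector3D p, Vector3D center, Rotation r) {
        return r.applyTo(p.subtract(center)).add(center);
      }

      public static void main(String[] args) {
        Vector3D center = new Vector3D(1, 0, 0);
        Rotation quarterTurn = new Rotation(Vector3D.PLUS_K, Math.PI / 2, RotationConvention.VECTOR_OPERATOR);
        System.out.println(scaleAbout(new Vector3D(2, 0, 0), center, 0.5));
        System.out.println(rotateAbout(new Vector3D(2, 0, 0), center, quarterTurn));
      }
    }
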
@@ -254,11 +258,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
 * @return Brightness structure
 */
 private Brightness getBrightness(
-PhotometricFunction pf,
-SmallBodyModel sbm,
-long intersect,
-Vector3D intersectPoint,
-boolean isDefault) {
+PhotometricFunction pf, SmallBodyModel sbm, long intersect, Vector3D intersectPoint, boolean isDefault) {
 Vector3D facetToCamera = cameraXYZ.subtract(intersectPoint);
@@ -269,8 +269,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
 double distFromCamera = facetToCamera.getNorm();
 // speeds up calculation along the limb. Need to combine all pixels in the ifov.
-double kmPerPixel =
-ifov * distFromCamera / Math.abs(FastMath.cos(Math.min(Math.toRadians(60), emission)));
+double kmPerPixel = ifov * distFromCamera / Math.abs(FastMath.cos(Math.min(Math.toRadians(60), emission)));
 double sum = 0;
@@ -304,16 +303,13 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
 }
 }
 double albedo = (isDefault && albedoMap.containsKey(cell)) ? albedoMap.get(cell) : 1;
-sum +=
-albedo
-* pf.getValue(
-FastMath.cos(incidence), FastMath.cos(emission), FastMath.toDegrees(phase));
+sum += albedo * pf.getValue(FastMath.cos(incidence), FastMath.cos(emission), FastMath.toDegrees(phase));
 }
 logger.printf(
 Level.DEBUG,
 "Thread %d lat/lon %.2f/%.2f, %s, sum %f, cells %d, %.2f",
-Thread.currentThread().getId(),
+Thread.currentThread().threadId(),
 Math.toDegrees(intersectPoint.getDelta()),
 Math.toDegrees(intersectPoint.getAlpha()),
 intersectPoint.toString(),
@@ -321,8 +317,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
 cells.size(),
 sum / cells.size());
-return new Brightness(
-incidence, emission, phase, sum / cells.size(), distFromCamera, facetToCamera, normal);
+return new Brightness(incidence, emission, phase, sum / cells.size(), distFromCamera, facetToCamera, normal);
 }
 class BrightnessCalculator implements Callable<Map<Integer, Brightness>> {
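
In getBrightness above, the ground footprint of a pixel is estimated as ifov times range, divided by the cosine of the emission angle capped at 60 degrees so the footprint does not blow up along the limb. A minimal stand-alone sketch of that estimate, with illustrative names and made-up values::

    // A minimal sketch (not Terrasaur code) of the footprint estimate used in getBrightness.
    public final class PixelFootprintSketch {
      static double kmPerPixel(double ifovRadians, double rangeKm, double emissionRadians) {
        double capped = Math.min(Math.toRadians(60), emissionRadians); // cap near the limb
        return ifovRadians * rangeKm / Math.abs(Math.cos(capped));
      }

      public static void main(String[] args) {
        // e.g. a 100 microradian ifov at 5 km range, viewed 45 degrees off-normal (values are made up)
        System.out.printf("%.4f km/pixel%n", kmPerPixel(100e-6, 5.0, Math.toRadians(45)));
      }
    }
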
@@ -338,7 +333,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
 @Override
 public Map<Integer, Brightness> call() throws Exception {
-logger.info("Thread {}: starting", Thread.currentThread().getId());
+logger.info("Thread {}: starting", Thread.currentThread().threadId());
 int xPixels = subPixel * nPixelsX;
@@ -352,8 +347,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
 int i = index % xPixels;
 Vector3D pixelDir = pixelToBodyFixed(((double) i) / subPixel, ((double) j) / subPixel);
-long intersect =
-globalModel.computeRayIntersection(cameraXYZArray, pixelDir.toArray(), intersectPoint);
+long intersect = globalModel.computeRayIntersection(cameraXYZArray, pixelDir.toArray(), intersectPoint);
 if (intersect > -1) {
@@ -371,16 +365,14 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
 SmallBodyModel localModel = lmc.get(intersectPt3D);
 if (localModel != null) {
-long localIntersect =
-localModel.computeRayIntersection(
+long localIntersect = localModel.computeRayIntersection(
 cameraXYZArray, pixelDir.toArray(), localIntersectPoint);
 if (localIntersect != -1) {
 break;
 } else {
-logger.debug(
-String.format(
+logger.debug(String.format(
 "Thread %d: No intersection with local model for pixel (%d,%d): lat/lon %.2f/%.2f, using global intersection %d %s",
-Thread.currentThread().getId(),
+Thread.currentThread().threadId(),
 i,
 j,
 Math.toDegrees(intersectPt3D.getDelta()),
@@ -397,7 +389,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
 }
 }
-logger.info("Thread {}: finished", Thread.currentThread().getId());
+logger.info("Thread {}: finished", Thread.currentThread().threadId());
 return brightness;
 }
@@ -411,7 +403,8 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
 Map<Integer, Brightness> brightness = new HashMap<>();
 try (ExecutorService executor = Executors.newFixedThreadPool(numThreads)) {
-List<Integer> indices = IntStream.range(0, yPixels * xPixels).boxed().toList();
+List<Integer> indices =
+IntStream.range(0, yPixels * xPixels).boxed().toList();
 int numPixels = indices.size();
@@ -467,7 +460,8 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
 try (ExecutorService executor = Executors.newFixedThreadPool(numThreads)) {
-List<Integer> indices = IntStream.range(0, yPixels * xPixels).boxed().toList();
+List<Integer> indices =
+IntStream.range(0, yPixels * xPixels).boxed().toList();
 int numPixels = indices.size();
@@ -529,8 +523,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
 }
 BufferedImage img = new BufferedImage(nPixelsX, nPixelsY, BufferedImage.TYPE_INT_RGB);
-img.createGraphics()
-.drawImage(image.getScaledInstance(nPixelsX, nPixelsY, Image.SCALE_SMOOTH), 0, 0, null);
+img.createGraphics().drawImage(image.getScaledInstance(nPixelsX, nPixelsY, Image.SCALE_SMOOTH), 0, 0, null);
 return img;
 }
@@ -559,8 +552,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
 double[] intersectPoint = new double[3];
 Vector3D boresight = sumFile.boresight();
 long intersect =
-getGlobalModel()
-.computeRayIntersection(cameraXYZ.toArray(), boresight.toArray(), intersectPoint);
+getGlobalModel().computeRayIntersection(cameraXYZ.toArray(), boresight.toArray(), intersectPoint);
 if (intersect > 0) {
 Vector3D nadirPt = new Vector3D(intersectPoint);
 double lat = nadirPt.getDelta();
@@ -573,17 +565,20 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
 double range = cameraXYZ.subtract(nadirPt).getNorm();
 addMetaData("image.cell", "Index of center pixel cell", Long.toString(intersect));
-addMetaData("image.lat", "Center latitude", StringFunctions.toDegreesLat("%.2f ").apply(lat));
-addMetaData(
-"image.lon", "Center longitude", StringFunctions.toDegreesELon("%.2f ").apply(lon));
-addMetaData(
-"image.inc", "Center incidence in degrees", String.format("%.2f", Math.toDegrees(inc)));
+addMetaData(
+"image.lat",
+"Center latitude",
+StringFunctions.toDegreesLat("%.2f ").apply(lat));
+addMetaData(
+"image.lon",
+"Center longitude",
+StringFunctions.toDegreesELon("%.2f ").apply(lon));
+addMetaData("image.inc", "Center incidence in degrees", String.format("%.2f", Math.toDegrees(inc)));
 addMetaData(
 "image.ems",
 "Center emission in degrees (may not be zero if facet is tilted)",
 String.format("%.2f", Math.toDegrees(ems)));
-addMetaData(
-"image.phs", "Center phase in degrees", String.format("%.2f", Math.toDegrees(phs)));
+addMetaData("image.phs", "Center phase in degrees", String.format("%.2f", Math.toDegrees(phs)));
 addMetaData("image.range", "Center point range in m", String.format("%.3f", range * 1e3));
 addMetaData(
 "image.resolution",
@@ -659,8 +654,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
 private static Options defineOptions() {
 Options options = TerrasaurTool.defineOptions();
-options.addOption(
-Option.builder("albedoFile")
+options.addOption(Option.builder("albedoFile")
 .hasArg()
 .desc(
 """
@@ -674,8 +668,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
 .replaceAll("\\s+", " ")
 .strip())
 .build());
-options.addOption(
-Option.builder("localModels")
+options.addOption(Option.builder("localModels")
 .hasArgs()
 .desc(
 """
@@ -695,46 +688,38 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
 .replaceAll("\\s+", " ")
 .strip())
 .build());
-options.addOption(
-Option.builder("logFile")
+options.addOption(Option.builder("logFile")
 .hasArg()
 .desc("If present, save screen output to log file.")
 .build());
 StringBuilder sb = new StringBuilder();
 for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
-options.addOption(
-Option.builder("logLevel")
+options.addOption(Option.builder("logLevel")
 .hasArg()
-.desc(
-"If present, print messages above selected priority. Valid values are "
+.desc("If present, print messages above selected priority. Valid values are "
 + sb.toString().trim()
 + ". Default is INFO.")
 .build());
-options.addOption(
-Option.builder("model")
+options.addOption(Option.builder("model")
 .required()
 .hasArg()
 .desc(
 "Required. Default shape model filename. Supported formats are OBJ, PLT, PLY, STL, or VTK format.")
 .build());
-options.addOption(
-Option.builder("numThreads")
+options.addOption(Option.builder("numThreads")
 .hasArg()
 .desc("Number of threads to run in parallel when generating the image. Default is 2.")
 .build());
-options.addOption(
-Option.builder("photo")
+options.addOption(Option.builder("photo")
 .hasArg()
 .desc(PhotometricFunction.getOptionString().trim() + " Default is OREX.")
 .build());
-options.addOption(
-Option.builder("output")
+options.addOption(Option.builder("output")
 .required()
 .hasArg()
 .desc("Required. Name of image file to write. Valid extensions are fits or png.")
 .build());
-options.addOption(
-Option.builder("rotateModel")
+options.addOption(Option.builder("rotateModel")
 .hasArg()
 .desc(
 """
@@ -745,13 +730,11 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
 .replaceAll("\\s+", " ")
 .strip())
 .build());
-options.addOption(
-Option.builder("scaleModel")
+options.addOption(Option.builder("scaleModel")
 .hasArg()
 .desc("If present, factor to scale shape model. The center is unchanged.")
 .build());
-options.addOption(
-Option.builder("subPixel")
+options.addOption(Option.builder("subPixel")
 .hasArg()
 .desc(
 """
@@ -764,8 +747,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
 .replaceAll("\\s+", " ")
 .strip())
 .build());
-options.addOption(
-Option.builder("sumFile")
+options.addOption(Option.builder("sumFile")
 .required()
 .hasArg()
 .desc("Required. Name of sum file to read.")
@@ -786,15 +768,11 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
 NativeLibraryLoader.loadVtkLibraries();
-Double scale =
-cl.hasOption("scaleModel") ? Double.parseDouble(cl.getOptionValue("scaleModel")) : null;
+Double scale = cl.hasOption("scaleModel") ? Double.parseDouble(cl.getOptionValue("scaleModel")) : null;
 Rotation rotation =
-cl.hasOption("rotateModel")
-? RotationUtils.stringToRotation(cl.getOptionValue("rotateModel"))
-: null;
+cl.hasOption("rotateModel") ? RotationUtils.stringToRotation(cl.getOptionValue("rotateModel")) : null;
-RenderShapeFromSumFile app =
-new RenderShapeFromSumFile(cl.getOptionValue("model"), scale, rotation);
+RenderShapeFromSumFile app = new RenderShapeFromSumFile(cl.getOptionValue("model"), scale, rotation);
 SumFile sumFile = app.loadSumFile(cl.getOptionValue("sumFile"));
 if (cl.hasOption("albedoFile")) app.loadAlbedoFile(cl.getOptionValue("albedoFile"));
@@ -805,10 +783,8 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
 if (cl.hasOption("subPixel")) app.setSubPixel(Integer.parseInt(cl.getOptionValue("subPixel")));
 PhotometricFunction pf = PhotometricFunction.OREX1;
-if (cl.hasOption("photo"))
-pf = PhotometricFunction.getPhotometricFunction(cl.getOptionValue("photo"));
-int numThreads =
-cl.hasOption("numThreads") ? Integer.parseInt(cl.getOptionValue("numThreads")) : 2;
+if (cl.hasOption("photo")) pf = PhotometricFunction.getPhotometricFunction(cl.getOptionValue("photo"));
+int numThreads = cl.hasOption("numThreads") ? Integer.parseInt(cl.getOptionValue("numThreads")) : 2;
 String outputFilename = cl.getOptionValue("output");
 String dirname = FilenameUtils.getPath(outputFilename);
@@ -828,8 +804,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
 Fits fits = new Fits();
 ImageHDU imageHDU = (ImageHDU) Fits.makeHDU(app.getFits(pf, numThreads));
 Header header = imageHDU.getHeader();
-header.addValue(
-DateTime.TIMESYS_UTC, app.metadata.get("image.utc").getValue(), "Time from the SUM file");
+header.addValue(DateTime.TIMESYS_UTC, app.metadata.get("image.utc").getValue(), "Time from the SUM file");
 header.addValue("TITLE", sumFile.picnm(), "Title of SUM file");
 header.addValue("PLANE1", "brightness", "from 0 to 1");
 header.addValue("PLANE2", "incidence", "degrees");

View File

@@ -62,57 +62,87 @@ public class RotationConversion implements TerrasaurTool {
"""; """;
return TerrasaurTool.super.fullDescription(options, header, footer); return TerrasaurTool.super.fullDescription(options, header, footer);
} }
private static Options defineOptions() { private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions(); Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("logFile").hasArg() options.addOption(Option.builder("logFile")
.desc("If present, save screen output to log file.").build()); .hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder(); StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
sb.append(String.format("%s ", l.name())); options.addOption(Option.builder("logLevel")
options.addOption(Option.builder("logLevel").hasArg() .hasArg()
.desc("If present, print messages above selected priority. Valid values are " .desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim() + ". Default is INFO.") + sb.toString().trim() + ". Default is INFO.")
.build()); .build());
options.addOption(Option.builder("angle").hasArg().desc("Rotation angle, in radians.").build()); options.addOption(Option.builder("angle")
options.addOption( .hasArg()
Option.builder("axis0").hasArg().desc("First element of rotation axis.").build()); .desc("Rotation angle, in radians.")
options.addOption( .build());
Option.builder("axis1").hasArg().desc("Second element of rotation axis.").build()); options.addOption(Option.builder("axis0")
options.addOption( .hasArg()
Option.builder("axis2").hasArg().desc("Third element of rotation axis.").build()); .desc("First element of rotation axis.")
options.addOption(Option.builder("cardanXYZ1").hasArg() .build());
.desc("Cardan angle for the first rotation (about the X axis) in radians.").build()); options.addOption(Option.builder("axis1")
options.addOption(Option.builder("cardanXYZ2").hasArg() .hasArg()
.desc("Cardan angle for the second rotation (about the Y axis) in radians.").build()); .desc("Second element of rotation axis.")
options.addOption(Option.builder("cardanXYZ3").hasArg() .build());
.desc("Cardan angle for the third rotation (about the Z axis) in radians.").build()); options.addOption(Option.builder("axis2")
options.addOption(Option.builder("eulerZXZ1").hasArg() .hasArg()
.desc("Euler angle for the first rotation (about the Z axis) in radians.").build()); .desc("Third element of rotation axis.")
options.addOption(Option.builder("eulerZXZ2").hasArg() .build());
options.addOption(Option.builder("cardanXYZ1")
.hasArg()
.desc("Cardan angle for the first rotation (about the X axis) in radians.")
.build());
options.addOption(Option.builder("cardanXYZ2")
.hasArg()
.desc("Cardan angle for the second rotation (about the Y axis) in radians.")
.build());
options.addOption(Option.builder("cardanXYZ3")
.hasArg()
.desc("Cardan angle for the third rotation (about the Z axis) in radians.")
.build());
options.addOption(Option.builder("eulerZXZ1")
.hasArg()
.desc("Euler angle for the first rotation (about the Z axis) in radians.")
.build());
options.addOption(Option.builder("eulerZXZ2")
.hasArg()
.desc("Euler angle for the second rotation (about the rotated X axis) in radians.") .desc("Euler angle for the second rotation (about the rotated X axis) in radians.")
.build()); .build());
options.addOption(Option.builder("eulerZXZ3").hasArg() options.addOption(Option.builder("eulerZXZ3")
.desc("Euler angle for the third rotation (about the rotated Z axis) in radians.").build()); .hasArg()
options.addOption( .desc("Euler angle for the third rotation (about the rotated Z axis) in radians.")
Option.builder("q0").hasArg().desc("Scalar term for quaternion: cos(theta/2)").build()); .build());
options.addOption(Option.builder("q1").hasArg() options.addOption(Option.builder("q0")
.desc("First vector term for quaternion: sin(theta/2) * V[0]").build()); .hasArg()
options.addOption(Option.builder("q2").hasArg() .desc("Scalar term for quaternion: cos(theta/2)")
.desc("Second vector term for quaternion: sin(theta/2) * V[1]").build()); .build());
options.addOption(Option.builder("q3").hasArg() options.addOption(Option.builder("q1")
.desc("Third vector term for quaternion: sin(theta/2) * V[2]").build()); .hasArg()
options.addOption(Option.builder("matrix").hasArg() .desc("First vector term for quaternion: sin(theta/2) * V[0]")
.build());
options.addOption(Option.builder("q2")
.hasArg()
.desc("Second vector term for quaternion: sin(theta/2) * V[1]")
.build());
options.addOption(Option.builder("q3")
.hasArg()
.desc("Third vector term for quaternion: sin(theta/2) * V[2]")
.build());
options.addOption(Option.builder("matrix")
.hasArg()
.desc("name of file containing rotation matrix to convert to Euler angles. " .desc("name of file containing rotation matrix to convert to Euler angles. "
+ "Format is 3x3 array in plain text separated by white space.") + "Format is 3x3 array in plain text separated by white space.")
.build()); .build());
options.addOption(Option.builder("anglesInDegrees").desc( options.addOption(Option.builder("anglesInDegrees")
"If present, input angles in degrees and print output angles in degrees. Default is false.") .desc("If present, input angles in degrees and print output angles in degrees. Default is false.")
.build()); return options; .build());
return options;
} }
public static void main(String[] args) throws Exception { public static void main(String[] args) throws Exception {
@@ -127,14 +157,11 @@ public class RotationConversion implements TerrasaurTool {
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml))); logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
boolean inDegrees = cl.hasOption("anglesInDegrees"); boolean inDegrees = cl.hasOption("anglesInDegrees");
boolean axisAndAngle = cl.hasOption("angle") && cl.hasOption("axis0") && cl.hasOption("axis1") boolean axisAndAngle =
&& cl.hasOption("axis2"); cl.hasOption("angle") && cl.hasOption("axis0") && cl.hasOption("axis1") && cl.hasOption("axis2");
boolean cardanXYZ = boolean cardanXYZ = cl.hasOption("cardanXYZ1") && cl.hasOption("cardanXYZ2") && cl.hasOption("cardanXYZ3");
cl.hasOption("cardanXYZ1") && cl.hasOption("cardanXYZ2") && cl.hasOption("cardanXYZ3"); boolean eulerZXZ = cl.hasOption("eulerZXZ1") && cl.hasOption("eulerZXZ3") && cl.hasOption("eulerZXZ3");
boolean eulerZXZ = boolean quaternion = cl.hasOption("q0") && cl.hasOption("q1") && cl.hasOption("q2") && cl.hasOption("q3");
cl.hasOption("eulerZXZ1") && cl.hasOption("eulerZXZ3") && cl.hasOption("eulerZXZ3");
boolean quaternion =
cl.hasOption("q0") && cl.hasOption("q1") && cl.hasOption("q2") && cl.hasOption("q3");
boolean matrix = cl.hasOption("matrix"); boolean matrix = cl.hasOption("matrix");
if (!(axisAndAngle || cardanXYZ || eulerZXZ || quaternion || matrix)) { if (!(axisAndAngle || cardanXYZ || eulerZXZ || quaternion || matrix)) {
@@ -145,26 +172,25 @@ public class RotationConversion implements TerrasaurTool {
Rotation r = null; Rotation r = null;
if (matrix) { if (matrix) {
List<String> lines = List<String> lines = FileUtils.readLines(new File(cl.getOptionValue("matrix")), Charset.defaultCharset());
FileUtils.readLines(new File(cl.getOptionValue("matrix")), Charset.defaultCharset());
double[][] m = new double[3][3]; double[][] m = new double[3][3];
for (int i = 0; i < 3; i++) { for (int i = 0; i < 3; i++) {
String[] parts = lines.get(i).trim().split("\\s+"); String[] parts = lines.get(i).trim().split("\\s+");
for (int j = 0; j < 3; j++) for (int j = 0; j < 3; j++) m[i][j] = Double.parseDouble(parts[j].trim());
m[i][j] = Double.parseDouble(parts[j].trim());
} }
r = new Rotation(m, 1e-10); r = new Rotation(m, 1e-10);
} }
if (axisAndAngle) { if (axisAndAngle) {
double angle = Double.parseDouble(cl.getOptionValue("angle").trim()); double angle = Double.parseDouble(cl.getOptionValue("angle").trim());
if (inDegrees) if (inDegrees) angle = Math.toRadians(angle);
angle = Math.toRadians(angle);
r = new Rotation( r = new Rotation(
new Vector3D(Double.parseDouble(cl.getOptionValue("axis0").trim()), new Vector3D(
Double.parseDouble(cl.getOptionValue("axis0").trim()),
Double.parseDouble(cl.getOptionValue("axis1").trim()), Double.parseDouble(cl.getOptionValue("axis1").trim()),
Double.parseDouble(cl.getOptionValue("axis2").trim())), Double.parseDouble(cl.getOptionValue("axis2").trim())),
angle, RotationConvention.FRAME_TRANSFORM); angle,
RotationConvention.FRAME_TRANSFORM);
} }
if (cardanXYZ) { if (cardanXYZ) {
@@ -176,8 +202,7 @@ public class RotationConversion implements TerrasaurTool {
angle2 = Math.toRadians(angle2); angle2 = Math.toRadians(angle2);
angle3 = Math.toRadians(angle3); angle3 = Math.toRadians(angle3);
} }
r = new Rotation(RotationOrder.XYZ, RotationConvention.FRAME_TRANSFORM, angle1, angle2, r = new Rotation(RotationOrder.XYZ, RotationConvention.FRAME_TRANSFORM, angle1, angle2, angle3);
angle3);
} }
if (eulerZXZ) { if (eulerZXZ) {
@@ -189,15 +214,16 @@ public class RotationConversion implements TerrasaurTool {
angle2 = Math.toRadians(angle2); angle2 = Math.toRadians(angle2);
angle3 = Math.toRadians(angle3); angle3 = Math.toRadians(angle3);
} }
r = new Rotation(RotationOrder.ZXZ, RotationConvention.FRAME_TRANSFORM, angle1, angle2, r = new Rotation(RotationOrder.ZXZ, RotationConvention.FRAME_TRANSFORM, angle1, angle2, angle3);
angle3);
} }
if (quaternion) { if (quaternion) {
r = new Rotation(Double.parseDouble(cl.getOptionValue("q0").trim()), r = new Rotation(
Double.parseDouble(cl.getOptionValue("q0").trim()),
Double.parseDouble(cl.getOptionValue("q1").trim()), Double.parseDouble(cl.getOptionValue("q1").trim()),
Double.parseDouble(cl.getOptionValue("q2").trim()), Double.parseDouble(cl.getOptionValue("q2").trim()),
Double.parseDouble(cl.getOptionValue("q3").trim()), true); Double.parseDouble(cl.getOptionValue("q3").trim()),
true);
} }
double[][] m = r.getMatrix(); double[][] m = r.getMatrix();
@@ -207,19 +233,20 @@ public class RotationConversion implements TerrasaurTool {
System.out.println(matrixString); System.out.println(matrixString);
String axisAndAngleString = inDegrees String axisAndAngleString = inDegrees
? String.format("angle (degrees), axis:\n%g, %s", Math.toDegrees(r.getAngle()), ? String.format(
r.getAxis(RotationConvention.FRAME_TRANSFORM)) "angle (degrees), axis:\n%g, %s",
: String.format("angle (radians), axis:\n%g, %s", r.getAngle(), Math.toDegrees(r.getAngle()), r.getAxis(RotationConvention.FRAME_TRANSFORM))
r.getAxis(RotationConvention.FRAME_TRANSFORM)); : String.format(
"angle (radians), axis:\n%g, %s", r.getAngle(), r.getAxis(RotationConvention.FRAME_TRANSFORM));
System.out.println(axisAndAngleString); System.out.println(axisAndAngleString);
try { try {
double[] angles = r.getAngles(RotationOrder.XYZ, RotationConvention.FRAME_TRANSFORM); double[] angles = r.getAngles(RotationOrder.XYZ, RotationConvention.FRAME_TRANSFORM);
String cardanString = inDegrees String cardanString = inDegrees
? String.format("Cardan XYZ angles (degrees):\n%g, %g, %g", Math.toDegrees(angles[0]), ? String.format(
Math.toDegrees(angles[1]), Math.toDegrees(angles[2])) "Cardan XYZ angles (degrees):\n%g, %g, %g",
: String.format("Cardan XYZ angles (radians):\n%g, %g, %g", angles[0], angles[1], Math.toDegrees(angles[0]), Math.toDegrees(angles[1]), Math.toDegrees(angles[2]))
angles[2]); : String.format("Cardan XYZ angles (radians):\n%g, %g, %g", angles[0], angles[1], angles[2]);
System.out.println(cardanString); System.out.println(cardanString);
} catch (CardanEulerSingularityException e) { } catch (CardanEulerSingularityException e) {
System.out.println("Cardan angles: encountered singularity, cannot solve"); System.out.println("Cardan angles: encountered singularity, cannot solve");
@@ -228,10 +255,10 @@ public class RotationConversion implements TerrasaurTool {
try { try {
double[] angles = r.getAngles(RotationOrder.ZXZ, RotationConvention.FRAME_TRANSFORM); double[] angles = r.getAngles(RotationOrder.ZXZ, RotationConvention.FRAME_TRANSFORM);
String eulerString = inDegrees String eulerString = inDegrees
? String.format("Euler ZXZ angles (degrees):\n%g, %g, %g", Math.toDegrees(angles[0]), ? String.format(
Math.toDegrees(angles[1]), Math.toDegrees(angles[2])) "Euler ZXZ angles (degrees):\n%g, %g, %g",
: String.format("Euler ZXZ angles (radians):\n%g, %g, %g", angles[0], angles[1], Math.toDegrees(angles[0]), Math.toDegrees(angles[1]), Math.toDegrees(angles[2]))
angles[2]); : String.format("Euler ZXZ angles (radians):\n%g, %g, %g", angles[0], angles[1], angles[2]);
System.out.println(eulerString); System.out.println(eulerString);
} catch (CardanEulerSingularityException e) { } catch (CardanEulerSingularityException e) {
System.out.println("Euler angles: encountered singularity, cannot solve"); System.out.println("Euler angles: encountered singularity, cannot solve");

View File

@@ -44,7 +44,7 @@ import terrasaur.utils.math.MathConversions;
 public class SPKFromSumFile implements TerrasaurTool {
-private final static Logger logger = LogManager.getLogger();
+private static final Logger logger = LogManager.getLogger();
 @Override
 public String shortDescription() {
@@ -54,7 +54,8 @@ public class SPKFromSumFile implements TerrasaurTool {
 @Override
 public String fullDescription(Options options) {
 String header = "";
-String footer = """
+String footer =
+"""
 Given three or more sumfiles, fit a parabola to the spacecraft
 trajectory in J2000 and create an input file for MKSPK.
 """;
@@ -72,8 +73,9 @@ public class SPKFromSumFile implements TerrasaurTool {
 private SPKFromSumFile() {}
-private SPKFromSumFile(Body observer, Body target, ReferenceFrame bodyFixed, Map<String, Double> weightMap,
-double extend) throws SpiceException {
+private SPKFromSumFile(
+Body observer, Body target, ReferenceFrame bodyFixed, Map<String, Double> weightMap, double extend)
+throws SpiceException {
 this.observer = observer;
 this.target = target;
 this.bodyFixed = bodyFixed;
@@ -102,8 +104,9 @@ public class SPKFromSumFile implements TerrasaurTool {
 * @param velocityIsJ2000 if true, user-supplied velocity is in J2000 frame
 * @return command to run MKSPK
 */
-public String writeMKSPKFiles(String basename, List<String> comments, int degree, final Vector3 velocity,
-boolean velocityIsJ2000) throws SpiceException {
+public String writeMKSPKFiles(
+String basename, List<String> comments, int degree, final Vector3 velocity, boolean velocityIsJ2000)
+throws SpiceException {
 String commentFile = basename + "-comments.txt";
 String setupFile = basename + ".setup";
@@ -112,23 +115,27 @@ public class SPKFromSumFile implements TerrasaurTool {
 try (PrintWriter pw = new PrintWriter(commentFile)) {
 StringBuilder sb = new StringBuilder();
 if (!comments.isEmpty()) {
-for (String comment : comments)
-sb.append(comment).append("\n");
+for (String comment : comments) sb.append(comment).append("\n");
 sb.append("\n");
 }
-sb.append(String.format("This SPK for %s was generated by fitting a parabola to each component of the " + "SCOBJ vector from " + "the following sumfiles:\n", target));
+sb.append(String.format(
+"This SPK for %s was generated by fitting a parabola to each component of the "
++ "SCOBJ vector from " + "the following sumfiles:\n",
+target));
 for (String sumFile : sumFilenames.values()) {
 sb.append(String.format("\t%s\n", sumFile));
 }
 sb.append("The SCOBJ vector was transformed to J2000 and an aberration correction ");
-sb.append(String.format("was applied to find the geometric position relative to %s before the parabola " + "fit. ", target.getName()));
-sb.append(String.format("The period covered by this SPK is %s to %s.",
+sb.append(String.format(
+"was applied to find the geometric position relative to %s before the parabola " + "fit. ",
+target.getName()));
+sb.append(String.format(
+"The period covered by this SPK is %s to %s.",
 new TDBTime(interval.getBegin()).toUTCString("ISOC", 3),
 new TDBTime(interval.getEnd()).toUTCString("ISOC", 3)));
 String allComments = sb.toString();
-for (String comment : allComments.split("\\r?\\n"))
-pw.println(WordUtils.wrap(comment, 80));
+for (String comment : allComments.split("\\r?\\n")) pw.println(WordUtils.wrap(comment, 80));
 } catch (FileNotFoundException e) {
 logger.error(e.getLocalizedMessage(), e);
 }
@@ -196,23 +203,37 @@ public class SPKFromSumFile implements TerrasaurTool {
 PolynomialFunction zPos = new PolynomialFunction(zCoeff);
 PolynomialFunction zVel = zPos.polynomialDerivative();
-logger.info("Polynomial fitting coefficients for geometric position of {} relative to {} in J2000:",
-observer.getName(), target.getName());
+logger.info(
+"Polynomial fitting coefficients for geometric position of {} relative to {} in J2000:",
+observer.getName(),
+target.getName());
 StringBuilder xMsg = new StringBuilder(String.format("X = %e ", xCoeff[0]));
 StringBuilder yMsg = new StringBuilder(String.format("Y = %e ", yCoeff[0]));
 StringBuilder zMsg = new StringBuilder(String.format("Z = %e ", zCoeff[0]));
 for (int i = 1; i <= degree; i++) {
-xMsg.append(xCoeff[i] < 0 ? "- " : "+ ").append(String.format("%e ", Math.abs(xCoeff[i]))).append("t").append(i > 1 ? "^" + i : "").append(" ");
-yMsg.append(yCoeff[i] < 0 ? "- " : "+ ").append(String.format("%e ", Math.abs(yCoeff[i]))).append("t").append(i > 1 ? "^" + i : "").append(" ");
-zMsg.append(zCoeff[i] < 0 ? "- " : "+ ").append(String.format("%e ", Math.abs(zCoeff[i]))).append("t").append(i > 1 ? "^" + i : "").append(" ");
+xMsg.append(xCoeff[i] < 0 ? "- " : "+ ")
+.append(String.format("%e ", Math.abs(xCoeff[i])))
+.append("t")
+.append(i > 1 ? "^" + i : "")
+.append(" ");
+yMsg.append(yCoeff[i] < 0 ? "- " : "+ ")
+.append(String.format("%e ", Math.abs(yCoeff[i])))
+.append("t")
+.append(i > 1 ? "^" + i : "")
+.append(" ");
+zMsg.append(zCoeff[i] < 0 ? "- " : "+ ")
+.append(String.format("%e ", Math.abs(zCoeff[i])))
+.append("t")
+.append(i > 1 ? "^" + i : "")
+.append(" ");
 }
 logger.info(xMsg);
 logger.info(yMsg);
 logger.info(zMsg);
 logger.debug("");
-logger.debug("NOTE: comparing aberration correction=LT+S positions from sumfile with aberration " +
-"correction=NONE for fit.");
+logger.debug("NOTE: comparing aberration correction=LT+S positions from sumfile with aberration "
++ "correction=NONE for fit.");
 for (Double t : sumFiles.keySet()) {
 TDBTime tdb = new TDBTime(t);
 SumFile sumFile = sumFiles.get(t);
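
The code above fits a polynomial to each component of the spacecraft position and differentiates the fit to obtain velocity. A minimal stand-alone sketch of the same fit-and-differentiate step for one component, using the Commons Math fitting package; the class name and sample values are made up::

    import org.apache.commons.math3.analysis.polynomials.PolynomialFunction;
    import org.apache.commons.math3.fitting.PolynomialCurveFitter;
    import org.apache.commons.math3.fitting.WeightedObservedPoints;

    // A minimal sketch (not Terrasaur code): fit a parabola to one position component,
    // then take its derivative to get velocity along that component.
    public final class ParabolaFitSketch {
      public static void main(String[] args) {
        WeightedObservedPoints obs = new WeightedObservedPoints();
        double[] t = {0, 60, 120, 180};          // seconds past an arbitrary epoch
        double[] x = {100.0, 99.1, 97.8, 96.1};  // kilometers
        for (int i = 0; i < t.length; i++) obs.add(t[i], x[i]);
        double[] coeff = PolynomialCurveFitter.create(2).fit(obs.toList());
        PolynomialFunction pos = new PolynomialFunction(coeff);
        PolynomialFunction vel = pos.polynomialDerivative();
        System.out.printf("x(90 s) = %.4f km, vx(90 s) = %.6f km/s%n", pos.value(90), vel.value(90));
      }
    }
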
@@ -234,32 +255,34 @@ public class SPKFromSumFile implements TerrasaurTool {
 double vx = -xVel.value(t);
 double vy = -yVel.value(t);
 double vz = -zVel.value(t);
-pw.printf("%.16e %.16e %.16e %.16e %.16e %.16e %.16e\n", t, -xPos.value(t), -yPos.value(t),
--zPos.value(t), vx, vy, vz);
+pw.printf(
+"%.16e %.16e %.16e %.16e %.16e %.16e %.16e\n",
+t, -xPos.value(t), -yPos.value(t), -zPos.value(t), vx, vy, vz);
 } else {
 Vector3 thisVelocity = new Vector3(velocity);
 if (!velocityIsJ2000) {
 TDBTime tdb = new TDBTime(t);
-thisVelocity = bodyFixed.getPositionTransformation(J2000, tdb).mxv(velocity);
+thisVelocity =
+bodyFixed.getPositionTransformation(J2000, tdb).mxv(velocity);
 }
 double vx = thisVelocity.getElt(0);
 double vy = thisVelocity.getElt(1);
 double vz = thisVelocity.getElt(2);
-pw.printf("%.16e %.16e %.16e %.16e %.16e %.16e %.16e\n", t, -xPos.value(t), -yPos.value(t),
--zPos.value(t), vx, vy, vz);
+pw.printf(
+"%.16e %.16e %.16e %.16e %.16e %.16e %.16e\n",
+t, -xPos.value(t), -yPos.value(t), -zPos.value(t), vx, vy, vz);
 }
 }
 } catch (FileNotFoundException e) {
 logger.error(e.getLocalizedMessage(), e);
 }
 try (PrintWriter pw = new PrintWriter(basename + ".csv")) {
 pw.println("# Note: fit quantities are without light time or aberration corrections");
 pw.println("# SCOBJ");
-pw.println("# UTC, TDB, SumFile, SPICE (body fixed) x, y, z, SCOBJ (body fixed) x, y, z, SCOBJ (J2000) x," +
-" y, z, SCOBJ (Geometric J2000) x, y, z, Fit SCOBJ (body fixed) x, y, z, Fit SCOBJ (Geometric " +
-"J2000) x, y, z");
+pw.println("# UTC, TDB, SumFile, SPICE (body fixed) x, y, z, SCOBJ (body fixed) x, y, z, SCOBJ (J2000) x,"
++ " y, z, SCOBJ (Geometric J2000) x, y, z, Fit SCOBJ (body fixed) x, y, z, Fit SCOBJ (Geometric "
++ "J2000) x, y, z");
 for (Double t : sumFiles.keySet()) {
 SumFile sumFile = sumFiles.get(t);
 pw.printf("%s,", sumFile.utcString());
@@ -300,8 +323,8 @@ public class SPKFromSumFile implements TerrasaurTool {
 pw.println();
 }
 pw.println("\n# Velocity");
-pw.println("# UTC, TDB, SumFile, SPICE (body fixed) x, y, z, SPICE (J2000) x, y, z, Fit (body fixed) x, " +
-"y, z, Fit (J2000) x, y, z");
+pw.println("# UTC, TDB, SumFile, SPICE (body fixed) x, y, z, SPICE (J2000) x, y, z, Fit (body fixed) x, "
++ "y, z, Fit (J2000) x, y, z");
 for (Double t : sumFiles.keySet()) {
 SumFile sumFile = sumFiles.get(t);
 pw.printf("%s,", sumFile.utcString());
@@ -327,7 +350,8 @@ public class SPKFromSumFile implements TerrasaurTool {
 if (velocity != null) {
 Vector3 thisVelocity = new Vector3(velocity);
 if (!velocityIsJ2000) {
-thisVelocity = bodyFixed.getPositionTransformation(J2000, tdb).mxv(velocity);
+thisVelocity =
+bodyFixed.getPositionTransformation(J2000, tdb).mxv(velocity);
 }
 double vx = thisVelocity.getElt(0);
 double vy = thisVelocity.getElt(1);
@@ -335,8 +359,8 @@ public class SPKFromSumFile implements TerrasaurTool {
 velJ2000 = new Vector3(vx, vy, vz);
 }
-StateVector stateJ2000 = new StateVector(new Vector3(xPos.value(t), yPos.value(t), zPos.value(t)),
-velJ2000);
+StateVector stateJ2000 =
+new StateVector(new Vector3(xPos.value(t), yPos.value(t), zPos.value(t)), velJ2000);
 velBodyFixed = j2000ToBodyFixed.mxv(stateJ2000).getVector3(1);
 pw.printf("%s, ", velBodyFixed.getElt(0));
@@ -357,19 +381,39 @@ public class SPKFromSumFile implements TerrasaurTool {
 private static Options defineOptions() {
 Options options = TerrasaurTool.defineOptions();
-options.addOption(Option.builder("degree").hasArg().desc("Degree of polynomial used to fit sumFile locations" + "." + " Default is 2.").build());
-options.addOption(Option.builder("extend").hasArg().desc("Extend SPK past the last sumFile by <arg> seconds. "
-+ " Default is zero.").build());
-options.addOption(Option.builder("frame").hasArg().desc("Name of body fixed frame. This will default to the "
-+ "target's body fixed frame.").build());
-options.addOption(Option.builder("logFile").hasArg().desc("If present, save screen output to log file.").build());
+options.addOption(Option.builder("degree")
+.hasArg()
+.desc("Degree of polynomial used to fit sumFile locations" + "." + " Default is 2.")
+.build());
+options.addOption(Option.builder("extend")
+.hasArg()
+.desc("Extend SPK past the last sumFile by <arg> seconds. " + " Default is zero.")
+.build());
+options.addOption(Option.builder("frame")
+.hasArg()
+.desc("Name of body fixed frame. This will default to the " + "target's body fixed frame.")
+.build());
+options.addOption(Option.builder("logFile")
+.hasArg()
+.desc("If present, save screen output to log file.")
+.build());
 StringBuilder sb = new StringBuilder();
-for (StandardLevel l : StandardLevel.values())
-sb.append(String.format("%s ", l.name()));
-options.addOption(Option.builder("logLevel").hasArg().desc("If present, print messages above selected " +
-"priority. Valid values are " + sb.toString().trim() + ". Default is INFO.").build());
-options.addOption(Option.builder("observer").required().hasArg().desc("Required. SPICE ID for the observer.").build());
-options.addOption(Option.builder("sumFile").hasArg().required().desc("""
+for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
+options.addOption(Option.builder("logLevel")
+.hasArg()
+.desc("If present, print messages above selected " + "priority. Valid values are "
++ sb.toString().trim() + ". Default is INFO.")
+.build());
+options.addOption(Option.builder("observer")
+.required()
+.hasArg()
+.desc("Required. SPICE ID for the observer.")
+.build());
+options.addOption(Option.builder("sumFile")
+.hasArg()
+.required()
+.desc(
+"""
 File listing sumfiles to read. This is a text file,
 one per line. You can include an optional weight
 after each filename. The default weight is 1.0.
@@ -385,14 +429,31 @@ public class SPKFromSumFile implements TerrasaurTool {
 D717506131G0.SUM
 # Weight this last image less than the others
 D717506132G0.SUM 0.25
-""").build());
-options.addOption(Option.builder("spice").required().hasArgs().desc("Required. SPICE metakernel file " +
-"containing body fixed frame and spacecraft kernels. Can specify more than one kernel, separated by "
-+ "whitespace.").build());
-options.addOption(Option.builder("target").required().hasArg().desc("Required. SPICE ID for the target.").build());
-options.addOption(Option.builder("velocity").hasArgs().desc("Spacecraft velocity relative to target in the " + "body fixed frame. If present, use this fixed velocity in the MKSPK input file. Default is to " + "take the derivative of the fit position. Specify as three floating point values in km/sec," + "separated by whitespace.").build());
-options.addOption(Option.builder("velocityJ2000").desc("If present, argument to -velocity is in J2000 frame. "
-+ " Ignored if -velocity is not set.").build()); return options;
+""")
+.build());
+options.addOption(Option.builder("spice")
+.required()
+.hasArgs()
+.desc("Required. SPICE metakernel file "
++ "containing body fixed frame and spacecraft kernels. Can specify more than one kernel, separated by "
++ "whitespace.")
+.build());
+options.addOption(Option.builder("target")
+.required()
+.hasArg()
+.desc("Required. SPICE ID for the target.")
+.build());
+options.addOption(Option.builder("velocity")
+.hasArgs()
+.desc("Spacecraft velocity relative to target in the "
++ "body fixed frame. If present, use this fixed velocity in the MKSPK input file. Default is to "
++ "take the derivative of the fit position. Specify as three floating point values in km/sec,"
++ "separated by whitespace.")
+.build());
+options.addOption(Option.builder("velocityJ2000")
+.desc("If present, argument to -velocity is in J2000 frame. " + " Ignored if -velocity is not set.")
+.build());
+return options;
 }
 public static void main(String[] args) throws SpiceException {
@@ -408,11 +469,9 @@ public class SPKFromSumFile implements TerrasaurTool {
 NativeLibraryLoader.loadSpiceLibraries();
 final double extend = cl.hasOption("extend") ? Double.parseDouble(cl.getOptionValue("extend")) : 0;
-for (String kernel : cl.getOptionValues("spice"))
-KernelDatabase.load(kernel);
+for (String kernel : cl.getOptionValues("spice")) KernelDatabase.load(kernel);
 Body observer = new Body(cl.getOptionValue("observer"));
 Body target = new Body(cl.getOptionValue("target"));
@@ -468,5 +527,4 @@ public class SPKFromSumFile implements TerrasaurTool {
 logger.info("Finished.");
 }
 }


@@ -97,23 +97,19 @@ public class ShapeFormatConverter implements TerrasaurTool {
    private static Options defineOptions() {
        Options options = TerrasaurTool.defineOptions();
        options.addOption(Option.builder("centerOfRotation")
                .hasArg()
                .desc(
                        "Subtract this point before applying rotation matrix, add back after. "
                                + "Specify by three floating point numbers separated by commas. If not present default is (0,0,0).")
                .build());
        options.addOption(Option.builder("decimate")
                .hasArg()
                .desc("Reduce the number of facets in a shape model. The argument should be between 0 and 1. "
                        + "For example, if a model has 100 facets and the argument to -decimate is 0.90, "
                        + "there will be approximately 10 facets after the decimation.")
                .build());
        options.addOption(Option.builder("input")
                .required()
                .hasArg()
                .desc(
@@ -122,68 +118,51 @@ public class ShapeFormatConverter implements TerrasaurTool {
                                + "by commas to specify XYZ coordinates, or latitude, longitude in degrees separated by commas. "
                                + "Transformed point will be written to stdout in the same format as the input string.")
                .build());
        options.addOption(Option.builder("inputFormat")
                .hasArg()
                .desc("Format of input file. If not present format will be inferred from inputFile extension.")
                .build());
        options.addOption(Option.builder("output")
                .hasArg()
                .desc("Required for all but single point input. Name of transformed file. "
                        + "Extension must be obj, plt, sbmt, stl, sum, or vtk.")
                .build());
        options.addOption(Option.builder("outputFormat")
                .hasArg()
                .desc("Format of output file. If not present format will be inferred from outputFile extension.")
                .build());
        options.addOption(Option.builder("register")
                .hasArg()
                .desc("Use SVD to transform input file to best align with register file.")
                .build());
        options.addOption(Option.builder("rotate")
                .hasArg()
                .desc("Rotate surface points and spacecraft position. "
                        + "Specify by an angle (degrees) and a 3 element rotation axis vector (XYZ) "
                        + "separated by commas.")
                .build());
        options.addOption(Option.builder("rotateToPrincipalAxes")
                .desc("Rotate body to align along its principal axes of inertia.")
                .build());
        options.addOption(Option.builder("scale")
                .hasArg()
                .desc("Scale the shape model by <arg>. This can either be one value or three "
                        + "separated by commas. One value scales all three axes uniformly, "
                        + "three values scale the x, y, and z axes respectively. For example, "
                        + "-scale 0.5,0.25,1.5 scales the model in the x dimension by 0.5, the "
                        + "y dimension by 0.25, the z dimension by 1.5.")
                .build());
        options.addOption(Option.builder("translate")
                .hasArg()
                .desc("Translate surface points and spacecraft position. "
                        + "Specify by three floating point numbers separated by commas.")
                .build());
        options.addOption(Option.builder("translateToCenter")
                .desc("Translate body so that its center of mass is at the origin.")
                .build());
        options.addOption(Option.builder("transform")
                .hasArg()
                .desc("Translate and rotate surface points and spacecraft position. "
                        + "Specify a file containing a 4x4 combined translation/rotation matrix. The top left 3x3 matrix "
                        + "is the rotation matrix. The top three entries in the right hand column are the translation "
                        + "vector. The bottom row is always 0 0 0 1.")
@@ -215,8 +194,9 @@ public class ShapeFormatConverter implements TerrasaurTool {
        String extension = null;
        if (cl.hasOption("inputFormat")) {
            try {
                extension = FORMATS.valueOf(cl.getOptionValue("inputFormat").toUpperCase())
                        .name()
                        .toLowerCase();
            } catch (IllegalArgumentException e) {
                logger.warn("Unsupported -inputFormat {}", cl.getOptionValue("inputFormat"));
            }
@@ -248,8 +228,7 @@ public class ShapeFormatConverter implements TerrasaurTool {
            polydata = new vtkPolyData();
            polydata.SetPoints(points);
            if (params.length == 2) {
                double[] array = new Vector3D(
                                Math.toRadians(Double.parseDouble(params[0].trim())),
                                Math.toRadians(Double.parseDouble(params[1].trim())))
                        .toArray();
@@ -261,8 +240,7 @@ public class ShapeFormatConverter implements TerrasaurTool {
                points.InsertNextPoint(array);
                coordType = COORDTYPE.XYZ;
            } else {
                logger.error("Can't read input shape model {} with format {}", filename, extension.toUpperCase());
                System.exit(0);
            }
        }
@@ -293,12 +271,10 @@ public class ShapeFormatConverter implements TerrasaurTool {
        for (Option option : cl.getOptions()) {
            if (option.getOpt().equals("centerOfRotation"))
                centerOfRotation =
                        MathConversions.toVector3(VectorUtils.stringToVector3D(cl.getOptionValue("centerOfRotation")));
            if (option.getOpt().equals("rotate"))
                rotation = MathConversions.toMatrix33(RotationUtils.stringToRotation(cl.getOptionValue("rotate")));
            if (option.getOpt().equals("scale")) {
                String scaleString = cl.getOptionValue("scale");
@@ -310,8 +286,7 @@ public class ShapeFormatConverter implements TerrasaurTool {
            }
            if (option.getOpt().equals("translate"))
                translation = MathConversions.toVector3(VectorUtils.stringToVector3D(cl.getOptionValue("translate")));
            if (option.getOpt().equals("transform")) {
                List<String> lines =
@@ -359,7 +334,8 @@ public class ShapeFormatConverter implements TerrasaurTool {
        vtkPoints points = polydata.GetPoints();
        double[][] pointsA = new double[(int) points.GetNumberOfPoints()][3];
        for (int i = 0; i < points.GetNumberOfPoints(); i++)
            pointsA[i] =
                    new Vector3D(points.GetPoint(i)).subtract(centerA).toArray();
        points = registeredPolydata.GetPoints();
        if (points.GetNumberOfPoints() != polydata.GetPoints().GetNumberOfPoints()) {
@@ -369,7 +345,8 @@ public class ShapeFormatConverter implements TerrasaurTool {
        double[][] pointsB = new double[(int) points.GetNumberOfPoints()][3];
        for (int i = 0; i < points.GetNumberOfPoints(); i++)
            pointsB[i] =
                    new Vector3D(points.GetPoint(i)).subtract(centerB).toArray();
        double[][] H = new double[3][3];
        for (int ii = 0; ii < points.GetNumberOfPoints(); ii++) {
@@ -400,8 +377,7 @@ public class ShapeFormatConverter implements TerrasaurTool {
        if (sumFile != null) {
            if (rotation != null && translation != null)
                sumFile.transform(MathConversions.toVector3D(translation), MathConversions.toRotation(rotation));
        } else {
            Vector3 center;
@@ -472,13 +448,16 @@ public class ShapeFormatConverter implements TerrasaurTool {
        extension = null;
        if (cl.hasOption("outputFormat")) {
            try {
                extension = FORMATS.valueOf(
                                cl.getOptionValue("outputFormat").toUpperCase())
                        .name()
                        .toLowerCase();
            } catch (IllegalArgumentException e) {
                logger.warn("Unsupported -outputFormat {}", cl.getOptionValue("outputFormat"));
            }
        }
        if (extension == null)
            extension = FilenameUtils.getExtension(filename).toLowerCase();
        switch (extension) {
            case "vtk" -> PolyDataUtil.saveShapeModelAsVTK(polydata, filename);
@@ -517,10 +496,7 @@ public class ShapeFormatConverter implements TerrasaurTool {
                }
            }
            default -> {
                logger.error("Can't write output shape model {} with format {}", filename, extension.toUpperCase());
                System.exit(0);
            }
        }
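
As an aside, the -transform option above reads a plain-text 4x4 matrix. A minimal illustrative file (hypothetical values, whitespace-separated entries assumed; identity rotation plus a translation of (10, 0, -2.5)) could look like:

    1 0 0 10.0
    0 1 0 0.0
    0 0 1 -2.5
    0 0 0 1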


@@ -95,10 +95,8 @@ public class SumFilesFromFlyby implements TerrasaurTool {
        this.sumfile = sumfile;
        // given phase angle p, closest approach point is (cos p, sin p)
        Vector3D closestApproach = new Vector3D(FastMath.cos(phase), FastMath.sin(phase), 0.).scalarMultiply(distance);
        Vector3D velocity = new Vector3D(-FastMath.sin(phase), FastMath.cos(phase), 0.).scalarMultiply(speed);
        /*-
         * Assumptions:
@@ -143,8 +141,7 @@ public class SumFilesFromFlyby implements TerrasaurTool {
        return s;
    }

    private String writeMSOPCKFiles(String basename, IntervalSet intervals, int frameID, SpiceBundle bundle)
            throws SpiceException {
        File commentFile = new File(basename + "_msopck-comments.txt");
@@ -195,10 +192,8 @@ public class SumFilesFromFlyby implements TerrasaurTool {
            double imageTime = t + t0;
            Vector3D scPos = scPosFunc.apply(t);
            SpiceQuaternion q = new SpiceQuaternion(
                    MathConversions.toMatrix33(RotationUtils.KprimaryJsecondary(scPos.negate(), Vector3D.MINUS_K)));
            attitudeMap.put(imageTime, q);
        }
@@ -210,11 +205,7 @@ public class SumFilesFromFlyby implements TerrasaurTool {
                Vector3 v = q.getVector();
                pw.printf(
                        "%s %.14e %.14e %.14e %.14e\n",
                        new TDBTime(t).toUTCString("ISOC", 6), q.getScalar(), v.getElt(0), v.getElt(1), v.getElt(2));
            }
        } catch (IOException e) {
            logger.error(e.getLocalizedMessage(), e);
@@ -230,8 +221,7 @@ public class SumFilesFromFlyby implements TerrasaurTool {
     * @param bundle SPICE bundle
     * @return command to run MKSPK
     */
    private String writeMKSPKFiles(String basename, IntervalSet intervals, int centerID, SpiceBundle bundle) {
        String commentFile = basename + "_mkspk-comments.txt";
        String setupFile = basename + "_mkspk.setup";
@@ -275,94 +265,72 @@ public class SumFilesFromFlyby implements TerrasaurTool {
        try (PrintWriter pw = new PrintWriter(inputFile)) {
            for (UnwritableInterval interval : intervals) {
                for (double t = interval.getBegin(); t < interval.getEnd(); t += interval.getLength() / 100) {
                    Vector3D scPos = scPosFunc.apply(t);
                    Vector3D scVel = scVelFunc.apply(t);
                    pw.printf(
                            "%.16e %.16e %.16e %.16e %.16e %.16e %.16e\n",
                            t + t0, scPos.getX(), scPos.getY(), scPos.getZ(), scVel.getX(), scVel.getY(), scVel.getZ());
                }
            }
        } catch (FileNotFoundException e) {
            logger.error(e.getLocalizedMessage(), e);
        }
        return String.format("mkspk -setup %s -input %s -output %s.bsp", setupFile, inputFile, basename);
    }

    private static Options defineOptions() {
        Options options = TerrasaurTool.defineOptions();
        options.addOption(Option.builder("distance")
                .hasArg()
                .required()
                .desc("Required. Flyby distance from body center in km.")
                .build());
        options.addOption(Option.builder("logFile")
                .hasArg()
                .desc("If present, save screen output to log file.")
                .build());
        StringBuilder sb = new StringBuilder();
        for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
        options.addOption(Option.builder("logLevel")
                .hasArg()
                .desc("If present, print messages above selected priority. Valid values are "
                        + sb.toString().trim()
                        + ". Default is INFO.")
                .build());
        options.addOption(Option.builder("mk")
                .hasArg()
                .desc(
                        "Path to NAIF metakernel. This should contain LSK, FK for the central body, and FK for the spacecraft. This is used by -mkspk and -msopck.")
                .build());
        options.addOption(Option.builder("mkspk")
                .hasArg()
                .desc(
                        "If present, create input files for MKSPK. The argument is the NAIF id for the central body (e.g. 10 for the Sun). This option requires -lsk.")
                .build());
        options.addOption(Option.builder("msopck")
                .desc("If present, create input files for MSOPCK. This option requires -lsk.")
                .build());
        options.addOption(Option.builder("phase")
                .hasArg()
                .required()
                .desc("Required. Phase angle at closest approach in degrees.")
                .build());
        options.addOption(Option.builder("speed")
                .hasArg()
                .required()
                .desc("Required. Flyby speed at closest approach in km/s.")
                .build());
        options.addOption(Option.builder("template")
                .hasArg()
                .required()
                .desc("Required. An existing sumfile to use as a template. Camera parameters are taken from this "
                        + "file, while camera position and orientation are calculated.")
                .build());
        options.addOption(Option.builder("times")
                .hasArgs()
                .desc("If present, list of times separated by white space. In seconds, relative to closest approach.")
                .build());
        return options;
    }
@@ -388,8 +356,7 @@ public class SumFilesFromFlyby implements TerrasaurTool {
        String base = FilenameUtils.getBaseName(sumFileTemplate);
        String ext = FilenameUtils.getExtension(sumFileTemplate);
        SumFilesFromFlyby app = new SumFilesFromFlyby(
                SumFile.fromFile(new File(sumFileTemplate)),
                Double.parseDouble(cl.getOptionValue("distance")),
                Math.toRadians(phase),
@@ -403,22 +370,19 @@ public class SumFilesFromFlyby implements TerrasaurTool {
        SpiceBundle bundle = null;
        if (cl.hasOption("mk")) {
            NativeLibraryLoader.loadSpiceLibraries();
            bundle = new SpiceBundle.Builder()
                    .addMetakernels(Collections.singletonList(cl.getOptionValue("mk")))
                    .build();
            KernelDatabase.load(cl.getOptionValue("mk"));
        }
        TimeConversion tc = bundle == null ? TimeConversion.createUsingInternalConstants() : bundle.getTimeConversion();
        for (double t : times) {
            SumFile s = app.getSumFile(t);
            try (PrintWriter pw =
                    new PrintWriter(String.format("%s_%d.%s", base, (int) tc.utcStringToTDB(s.utcString()), ext))) {
                pw.println(s);


@@ -101,37 +101,31 @@ public class TileLookup implements TerrasaurTool {
        if (fullPath.trim().isEmpty()) fullPath = ".";
        if (!fullPath.endsWith(File.separator)) fullPath += File.separator;
        return String.format(
                "%s%s.%d.%s", fullPath, FilenameUtils.getBaseName(dbName), tile, FilenameUtils.getExtension(dbName));
    }

    private static Options defineOptions() {
        Options options = TerrasaurTool.defineOptions();
        options.addOption(Option.builder("nTiles")
                .hasArg()
                .required()
                .desc("Number of points covering the sphere.")
                .build());
        options.addOption(Option.builder("printCoords")
                .desc(
                        "Print a table of points along with their coordinates. Takes precedence over -printStats, -printDistance, and -png.")
                .build());
        options.addOption(Option.builder("printDistance")
                .hasArg()
                .desc(
                        "Print a table of points sorted by distance from the input point. "
                                + "Format of the input point is longitude,latitude in degrees, comma separated without spaces. Takes precedence over -png.")
                .build());
        options.addOption(Option.builder("printStats")
                .desc(
                        "Print statistics on the distances (in degrees) between each point and its nearest neighbor. Takes precedence over -printDistance and -png.")
                .build());
        options.addOption(Option.builder("png")
                .hasArg()
                .desc(
                        "Plot points and distance to nearest point in degrees. Argument is the name of the PNG file to write.")
@@ -168,8 +162,7 @@ public class TileLookup implements TerrasaurTool {
        }
        if (cl.hasOption("printStats")) {
            System.out.println("Statistics on distances between each point and its nearest neighbor (degrees):");
            System.out.println(fs.getDistanceStats());
            System.exit(0);
        }
@@ -196,30 +189,28 @@ public class TileLookup implements TerrasaurTool {
        }
        if (cl.hasOption("png")) {
            PlotConfig config =
                    ImmutablePlotConfig.builder().width(1000).height(1000).build();
            String title = String.format("Fibonacci Sphere, n = %d, ", npts);
            Map<String, Projection> projections = new LinkedHashMap<>();
            projections.put(
                    title + "Rectangular Projection", new ProjectionRectangular(config.width(), config.height() / 2));
            projections.put(
                    title + "Orthographic Projection (0, 90)",
                    new ProjectionOrthographic(
                            config.width(), config.height(), new LatitudinalVector(1, Math.PI / 2, 0)));
            projections.put(
                    title + "Orthographic Projection (0, 0)",
                    new ProjectionOrthographic(config.width(), config.height(), new LatitudinalVector(1, 0, 0)));
            projections.put(
                    title + "Orthographic Projection (90, 0)",
                    new ProjectionOrthographic(
                            config.width(), config.height(), new LatitudinalVector(1, 0, Math.PI / 2)));
            projections.put(
                    title + "Orthographic Projection (180, 0)",
                    new ProjectionOrthographic(config.width(), config.height(), new LatitudinalVector(1, 0, Math.PI)));
            projections.put(
                    title + "Orthographic Projection (270, 0)",
                    new ProjectionOrthographic(
@@ -235,7 +226,11 @@ public class TileLookup implements TerrasaurTool {
            colors.add(Color.BLACK);
            for (int i = 1; i < nColors; i++) colors.add(ramp.getColor(i));
            colors.add(Color.WHITE);
            ramp = ImmutableColorRamp.builder()
                    .min(0)
                    .max(nColors)
                    .colors(colors)
                    .build();
            double radius = fs.getDistanceStats().getMax();
            ramp = ColorRamp.createLinear(0, radius).addLimitColors();
@@ -246,8 +241,15 @@ public class TileLookup implements TerrasaurTool {
                Projection p = projections.get(t);
                if (p instanceof ProjectionRectangular)
                    config = ImmutablePlotConfig.builder()
                            .from(config)
                            .height(500)
                            .build();
                else
                    config = ImmutablePlotConfig.builder()
                            .from(config)
                            .height(1000)
                            .build();
                MapPlot canvas = new MapPlot(config, p);
                AxisX xLowerAxis = new AxisX(0, 360, "Longitude (degrees)", "%.0fE");
@@ -262,7 +264,8 @@ public class TileLookup implements TerrasaurTool {
                for (int j = 0; j < config.height(); j++) {
                    LatitudinalVector lv = p.pixelToSpherical(i, j);
                    if (lv == null) continue;
                    double closestDistance =
                            Math.toDegrees(fs.getNearest(lv).getKey());
                    // int numPoints = fs.getDistanceMap(lv).subMap(0., Math.toRadians(radius)).size();
                    image.setRGB(
                            config.leftMargin() + i,
@@ -278,11 +281,8 @@ public class TileLookup implements TerrasaurTool {
                }
                if (p instanceof ProjectionRectangular) {
                    canvas.drawColorBar(ImmutableColorBar.builder()
                            .rect(new Rectangle(canvas.getPageWidth() - 60, config.topMargin(), 10, config.height()))
                            .ramp(ramp)
                            .numTicks(nColors + 1)
                            .tickFunction(aDouble -> String.format("%.1f", aDouble))
@@ -297,7 +297,9 @@ public class TileLookup implements TerrasaurTool {
                for (int i = 0; i < fs.getNumTiles(); i++) {
                    LatitudinalVector lv = fs.getTileCenter(i);
                    canvas.addAnnotation(
                            ImmutableAnnotation.builder()
                                    .text(String.format("%d", i))
                                    .build(),
                            lv.getLongitude(),
                            lv.getLatitude());
                }
@@ -310,8 +312,7 @@ public class TileLookup implements TerrasaurTool {
        BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_3BYTE_BGR);
        Graphics2D g = image.createGraphics();
        g.setRenderingHint(RenderingHints.KEY_INTERPOLATION, RenderingHints.VALUE_INTERPOLATION_BILINEAR);
        int imageWidth = width;
        int imageHeight = height / 3;
        g.drawImage(


@@ -52,7 +52,6 @@ import terrasaur.utils.SPICEUtil;
public class TransformFrame implements TerrasaurTool {
    private static final Logger logger = LogManager.getLogger();
    @Override
    public String shortDescription() {
        return "Transform coordinates between reference frames.";
@@ -93,38 +92,52 @@ public class TransformFrame implements TerrasaurTool {
        try (PrintWriter pw = new PrintWriter(outFile)) {
            for (TDBTime t : pointsOut.keySet()) {
                Vector3 v = pointsOut.get(t);
                pw.printf("%.6f,%.6e,%.6e,%.6e\n", t.getTDBSeconds(), v.getElt(0), v.getElt(1), v.getElt(2));
            }
        } catch (FileNotFoundException | SpiceException e) {
            logger.error(e.getLocalizedMessage());
        }
    }

    private static Options defineOptions() {
        Options options = TerrasaurTool.defineOptions();
        options.addOption(Option.builder("inFile")
                .required()
                .hasArg()
                .desc("Required. Text file containing comma separated t, x, y, z values. Time is ET.")
                .build());
        options.addOption(Option.builder("inFrame")
                .required()
                .hasArg()
                .desc("Required. Name of inFile reference frame.")
                .build());
        options.addOption(Option.builder("outFile")
                .required()
                .hasArg()
                .desc("Required. Name of output file. It will be in the same format as inFile.")
                .build());
        options.addOption(Option.builder("outFrame")
                .required()
                .hasArg()
                .desc("Required. Name of outFile reference frame.")
                .build());
        options.addOption(Option.builder("spice")
                .required()
                .hasArg()
                .desc("Required. Name of SPICE metakernel containing kernels needed to make the frame transformation.")
                .build());
        options.addOption(Option.builder("logFile")
                .hasArg()
                .desc("If present, save screen output to log file.")
                .build());
        StringBuilder sb = new StringBuilder();
        for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
        options.addOption(Option.builder("logLevel")
                .hasArg()
                .desc("If present, print messages above selected priority. Valid values are "
                        + sb.toString().trim() + ". Default is INFO.")
                .build());
        return options;
    }

    public static void main(String[] args) {
@@ -145,14 +158,15 @@ public class TransformFrame implements TerrasaurTool {
            List<String> lines = FileUtils.readLines(f, Charset.defaultCharset());
            for (String line : lines) {
                String trim = line.trim();
                if (trim.isEmpty() || trim.startsWith("#")) continue;
                String[] parts = trim.split(",");
                double et = Double.parseDouble(parts[0].trim());
                if (et > 0) {
                    TDBTime t = new TDBTime(et);
                    Vector3 v = new Vector3(
                            Double.parseDouble(parts[1].trim()),
                            Double.parseDouble(parts[2].trim()),
                            Double.parseDouble(parts[3].trim()));
                    map.put(t, v);
                }
            }
@@ -172,5 +186,4 @@ public class TransformFrame implements TerrasaurTool {
        tf.transformCoordinates(cl.getOptionValue("inFrame"), cl.getOptionValue("outFrame"));
        tf.write(cl.getOptionValue("outFile"));
    }
}
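
For reference, a sketch of the -inFile format that the parsing loop above accepts: comma separated ET seconds, x, y, z on each line, with lines starting with # ignored and entries with a non-positive time skipped. The values below are illustrative placeholders, not data from the repository:

    # et (TDB seconds), x, y, z
    790000000.000000, 1.234560e+02, -4.567800e+01, 8.901200e+00
    790000060.000000, 1.234570e+02, -4.567700e+01, 8.901300e+00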


@@ -61,11 +61,14 @@ public class TranslateTime implements TerrasaurTool {
        String header = "";
        String footer = "\nConvert between different time systems.\n";
        return TerrasaurTool.super.fullDescription(options, header, footer);
    }

    private enum Types {
        JULIAN,
        SCLK,
        TDB,
        TDBCALENDAR,
        UTC
    }

    private Map<Integer, SCLK> sclkMap;
@@ -129,29 +132,39 @@ public class TranslateTime implements TerrasaurTool {
    private static Options defineOptions() {
        Options options = TerrasaurTool.defineOptions();
        options.addOption(Option.builder("logFile")
                .hasArg()
                .desc("If present, save screen output to log file.")
                .build());
        StringBuilder sb = new StringBuilder();
        for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
        options.addOption(Option.builder("logLevel")
                .hasArg()
                .desc("If present, print messages above selected priority. Valid values are "
                        + sb.toString().trim() + ". Default is INFO.")
                .build());
        options.addOption(Option.builder("sclk")
                .hasArg()
                .desc("SPICE id of the sclk to use. Default is to use the first one found in the kernel pool.")
                .build());
        options.addOption(Option.builder("spice")
                .required()
                .hasArg()
                .desc("Required. SPICE metakernel containing leap second and SCLK.")
                .build());
        options.addOption(Option.builder("gui").desc("Launch a GUI.").build());
        options.addOption(
                Option.builder("inputDate").hasArgs().desc("Date to translate.").build());
        sb = new StringBuilder();
        for (Types system : Types.values()) {
            sb.append(String.format("%s ", system.name()));
        }
        options.addOption(Option.builder("inputSystem")
                .hasArg()
                .desc("Timesystem of inputDate. Valid values are "
                        + sb.toString().trim() + ". Default is UTC.")
                .build());
        return options;
    }

    public static void main(String[] args) throws SpiceException {
@@ -166,13 +179,11 @@ public class TranslateTime implements TerrasaurTool {
            logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
        // This is to avoid java crashing due to inability to connect to an X display
        if (!cl.hasOption("gui")) System.setProperty("java.awt.headless", "true");
        NativeLibraryLoader.loadSpiceLibraries();
        for (String kernel : cl.getOptionValues("spice")) KernelDatabase.load(kernel);
        LinkedHashMap<Integer, SCLK> sclkMap = new LinkedHashMap<>();
        String[] sclk_data_type = KernelPool.getNames("SCLK_DATA_*");
@@ -185,13 +196,11 @@ public class TranslateTime implements TerrasaurTool {
        SCLK sclk = null;
        if (cl.hasOption("sclk")) {
            int sclkID = Integer.parseInt(cl.getOptionValue("sclk"));
            if (sclkMap.containsKey(sclkID)) sclk = sclkMap.get(sclkID);
            else {
                logger.error("Cannot find SCLK {} in kernel pool!", sclkID);
                StringBuilder sb = new StringBuilder();
                for (Integer id : sclkMap.keySet()) sb.append(String.format("%d ", id));
                logger.error("Loaded IDs are {}", sb.toString());
            }
        } else {
@@ -221,12 +230,11 @@ public class TranslateTime implements TerrasaurTool {
        }
        StringBuilder sb = new StringBuilder();
        for (String s : cl.getOptionValues("inputDate")) sb.append(String.format("%s ", s));
        String inputDate = sb.toString().trim();
        Types type = cl.hasOption("inputSystem")
                ? Types.valueOf(cl.getOptionValue("inputSystem").toUpperCase())
                : Types.UTC;
        switch (type) {
@@ -248,15 +256,19 @@ public class TranslateTime implements TerrasaurTool {
        }
        System.out.printf("# input date %s (%s)\n", inputDate, type.name());
        System.out.printf("# UTC, TDB (Calendar), DOY, TDB, Julian Date, SCLK (%d)\n", sclk.getIDCode());
        String utcString = tt.toTDB().toUTCString("ISOC", 3);
        String tdbString = tt.toTDB().toString("YYYY-MM-DDTHR:MN:SC.### ::TDB");
        String doyString = tt.toTDB().toString("DOY");
        System.out.printf(
                "%s, %s, %s, %.6f, %s, %s\n",
                utcString,
                tdbString,
                doyString,
                tt.toTDB().getTDBSeconds(),
                tt.toJulian(),
                tt.toSCLK().toString());
    }
}
View File

@@ -0,0 +1,159 @@
/*
* The MIT License
* Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in
* all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
* THE SOFTWARE.
*/
package terrasaur.apps;
import java.io.*;
import java.util.Map;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
import org.apache.commons.io.FilenameUtils;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import terrasaur.templates.TerrasaurTool;
import terrasaur.utils.ICQUtils;
import terrasaur.utils.NativeLibraryLoader;
import terrasaur.utils.PolyDataUtil;
import vtk.vtkPolyData;
public class TriAx implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
private TriAx() {}
@Override
public String shortDescription() {
return "Generate a triaxial ellipsoid in ICQ format.";
}
@Override
public String fullDescription(Options options) {
String footer =
"\nTriAx is an implementation of the SPC tool TRIAX, which generates a triaxial ellipsoid in ICQ format.\n";
return TerrasaurTool.super.fullDescription(options, "", footer);
}
static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("A")
.required()
.hasArg()
.desc("Long axis of the ellipsoid, arbitrary units (usually assumed to be km).")
.build());
options.addOption(Option.builder("B")
.required()
.hasArg()
.desc("Medium axis of the ellipsoid, arbitrary units (usually assumed to be km).")
.build());
options.addOption(Option.builder("C")
.required()
.hasArg()
.desc("Short axis of the ellipsoid, arbitrary units (usually assumed to be km).")
.build());
options.addOption(Option.builder("Q")
.required()
.hasArg()
.desc("ICQ size parameter. This is conventionally but not necessarily a power of 2.")
.build());
options.addOption(Option.builder("saveOBJ")
.desc("If present, save in OBJ format as well. "
+ "File will have the same name as ICQ file with an OBJ extension.")
.build());
options.addOption(Option.builder("output")
.hasArg()
.required()
.desc("Name of ICQ file to write.")
.build());
return options;
}
static final int MAX_Q = 512;
public static void main(String[] args) {
TerrasaurTool defaultOBJ = new TriAx();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet()) logger.info("{} {}", ml.label, startupMessages.get(ml));
int q = Integer.parseInt(cl.getOptionValue("Q"));
String shapefile = cl.getOptionValue("output");
double[] ax = new double[3];
ax[0] = Double.parseDouble(cl.getOptionValue("A"));
ax[1] = Double.parseDouble(cl.getOptionValue("B"));
ax[2] = Double.parseDouble(cl.getOptionValue("C"));
double[][][][] vec = new double[3][MAX_Q + 1][MAX_Q + 1][6];
for (int f = 0; f < 6; f++) {
for (int i = 0; i <= q; i++) {
for (int j = 0; j <= q; j++) {
double[] u = ICQUtils.xyf2u(q, i, j, f, ax);
double z = 1
/ Math.sqrt(
Math.pow(u[0] / ax[0], 2) + Math.pow(u[1] / ax[1], 2) + Math.pow(u[2] / ax[2], 2));
double[] v = new Vector3D(u).scalarMultiply(z).toArray();
for (int k = 0; k < 3; k++) {
vec[k][i][j][f] = v[k];
}
}
}
}
ICQUtils.writeICQ(q, vec, shapefile);
if (cl.hasOption("saveOBJ")) {
String basename = FilenameUtils.getBaseName(shapefile);
String dirname = FilenameUtils.getFullPath(shapefile);
if (dirname.isEmpty()) dirname = ".";
File obj = new File(dirname, basename + ".obj");
NativeLibraryLoader.loadVtkLibraries();
try {
vtkPolyData polydata = PolyDataUtil.loadShapeModel(shapefile);
if (polydata == null) {
logger.error("Cannot read {}", shapefile);
System.exit(0);
}
polydata = PolyDataUtil.removeDuplicatePoints(polydata);
polydata = PolyDataUtil.removeUnreferencedPoints(polydata);
polydata = PolyDataUtil.removeZeroAreaFacets(polydata);
PolyDataUtil.saveShapeModelAsOBJ(polydata, obj.getPath());
} catch (Exception e) {
logger.error(e);
}
}
}
}
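
A short note on the scaling used in the nested loop above (my reading of the code, with A, B, C the command-line axes and u the cube-face direction returned by ICQUtils.xyf2u): each sample is multiplied by

\[ z = \frac{1}{\sqrt{(u_x/A)^2 + (u_y/B)^2 + (u_z/C)^2}} \]

so that the stored vertex v = z u satisfies (v_x/A)^2 + (v_y/B)^2 + (v_z/C)^2 = 1, i.e. every ICQ vertex lies exactly on the requested triaxial ellipsoid.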


@@ -30,7 +30,6 @@ import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
@@ -58,30 +57,33 @@ public class ValidateNormals implements TerrasaurTool {
    @Override
    public String fullDescription(Options options) {
        String footer = "\nThis program checks that the normals of the shape model are not pointing inward.\n";
        return TerrasaurTool.super.fullDescription(options, "", footer);
    }

    static Options defineOptions() {
        Options options = TerrasaurTool.defineOptions();
        options.addOption(Option.builder("fast")
                .desc("If present, only check for overhangs if center and normal point in opposite "
                        + "directions. Default behavior is to always check for intersections between body center "
                        + "and facet center.")
                .build());
        options.addOption(Option.builder("origin")
                .hasArg()
                .desc("If present, center of body in xyz coordinates. "
                        + "Specify as three floating point values separated by commas. Default is to use the centroid of "
                        + "the input shape model.")
                .build());
        options.addOption(Option.builder("obj")
                .required()
                .hasArg()
                .desc("Shape model to validate.")
                .build());
        options.addOption(Option.builder("output")
                .hasArg()
                .desc("Write out new OBJ file with corrected vertex orders for facets.")
                .build());
        options.addOption(Option.builder("numThreads")
                .hasArg()
                .desc("Number of threads to run. Default is 1.")
                .build());
@@ -119,17 +121,18 @@ public class ValidateNormals implements TerrasaurTool {
    private class FlippedNormalFinder implements Callable<List<Long>> {

        private static final DateTimeFormatter defaultFormatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss z")
                .withLocale(Locale.getDefault())
                .withZone(ZoneId.systemDefault());

        private final long index0;
        private final long index1;
        private final boolean fast;

        public FlippedNormalFinder(long index0, long index1, boolean fast) {
            this.index0 = index0;
            this.index1 = index1;
            this.fast = fast;
        }

        @Override
@@ -149,8 +152,7 @@ public class ValidateNormals implements TerrasaurTool {
                    long elapsed = Instant.now().getEpochSecond() - startTime;
                    long estimatedFinish = (long) (elapsed / (pctDone / 100) + startTime);
                    String finish = defaultFormatter.format(Instant.ofEpochSecond(estimatedFinish));
                    logger.info(String.format(
                            "Thread %d: read %d of %d facets. %.0f%% complete, projected finish %s",
                            Thread.currentThread().threadId(), index0 + i, index1, pctDone, finish));
                }
@@ -158,20 +160,21 @@ public class ValidateNormals implements TerrasaurTool {
                long index = index0 + i;
                CellInfo ci = CellInfo.getCellInfo(polyData, index, idList);
                boolean isOpposite = (ci.center().dotProduct(ci.normal()) < 0);
                int numCrossings = 0;
                if (isOpposite || !fast) {
                    // count up all crossings of the surface between the origin and the facet.
                    getOBBTree().IntersectWithLine(origin, ci.center().toArray(), null, cellIds);
                    for (int j = 0; j < cellIds.GetNumberOfIds(); j++) {
                        if (cellIds.GetId(j) == index) break;
                        numCrossings++;
                    }
                }
                // if numCrossings is even, the radial and normal should point in the same direction. If it
                // is odd, the radial and normal should point in opposite directions.
                boolean shouldBeOpposite = (numCrossings % 2 == 1);
                // XOR operator - true if both conditions are different
                if (isOpposite ^ shouldBeOpposite) flippedNormals.add(index);
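
A minimal standalone sketch of the parity rule applied above (method name and signature are illustrative, not part of the tool's API):

    // For a closed surface, a facet with an even number of surface crossings between the body
    // center and the facet center faces outward, so its normal should point away from the center;
    // an odd count means the facet lies under an overhang and its normal should face the center.
    static boolean looksFlipped(int crossingsBeforeFacet, double centerDotNormal) {
        boolean shouldBeOpposite = (crossingsBeforeFacet % 2 == 1);
        boolean isOpposite = centerDotNormal < 0;
        return isOpposite ^ shouldBeOpposite; // disagreement marks a flipped normal
    }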
@@ -207,8 +210,7 @@ public class ValidateNormals implements TerrasaurTool {
CommandLine cl = defaultOBJ.parseArgs(args, options); CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl); Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet()) for (MessageLabel ml : startupMessages.keySet()) logger.info("{} {}", ml.label, startupMessages.get(ml));
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
NativeLibraryLoader.loadVtkLibraries(); NativeLibraryLoader.loadVtkLibraries();
@@ -234,8 +236,8 @@ public class ValidateNormals implements TerrasaurTool {
Set<Long> flippedNormals = new HashSet<>(); Set<Long> flippedNormals = new HashSet<>();
int numThreads = boolean fast = cl.hasOption("fast");
cl.hasOption("numThreads") ? Integer.parseInt(cl.getOptionValue("numThreads")) : 1; int numThreads = cl.hasOption("numThreads") ? Integer.parseInt(cl.getOptionValue("numThreads")) : 1;
try (ExecutorService executor = Executors.newFixedThreadPool(numThreads)) { try (ExecutorService executor = Executors.newFixedThreadPool(numThreads)) {
List<Future<List<Long>>> futures = new ArrayList<>(); List<Future<List<Long>>> futures = new ArrayList<>();
@@ -244,7 +246,7 @@ public class ValidateNormals implements TerrasaurTool {
long fromIndex = i * numFacets; long fromIndex = i * numFacets;
long toIndex = Math.min(polyData.GetNumberOfCells(), fromIndex + numFacets); long toIndex = Math.min(polyData.GetNumberOfCells(), fromIndex + numFacets);
FlippedNormalFinder fnf = app.new FlippedNormalFinder(fromIndex, toIndex); FlippedNormalFinder fnf = app.new FlippedNormalFinder(fromIndex, toIndex, fast);
futures.add(executor.submit(fnf)); futures.add(executor.submit(fnf));
} }
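The surrounding loop splits the facet range evenly across worker threads; a self-contained sketch of the same fan-out/collect pattern, with the worker stubbed out and all names illustrative:

    import java.util.*;
    import java.util.concurrent.*;

    class PartitionDemo {
      public static void main(String[] args) throws Exception {
        long totalFacets = 1_000_000L; // stand-in for polyData.GetNumberOfCells()
        int numThreads = 4;
        long perThread = (totalFacets + numThreads - 1) / numThreads;
        Set<Long> flipped = new HashSet<>();
        try (ExecutorService executor = Executors.newFixedThreadPool(numThreads)) {
          List<Future<List<Long>>> futures = new ArrayList<>();
          for (int i = 0; i < numThreads; i++) {
            long from = i * perThread;
            long to = Math.min(totalFacets, from + perThread);
            futures.add(executor.submit(() -> checkRange(from, to))); // hypothetical worker
          }
          for (Future<List<Long>> f : futures) flipped.addAll(f.get());
        }
        System.out.println("flipped facets: " + flipped.size());
      }

      static List<Long> checkRange(long from, long to) {
        return new ArrayList<>(); // real work would inspect facets in [from, to)
      }
    }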
@@ -253,10 +255,7 @@ public class ValidateNormals implements TerrasaurTool {
executor.shutdown(); executor.shutdown();
} }
logger.info( logger.info("Found {} flipped normals out of {} facets", flippedNormals.size(), polyData.GetNumberOfCells());
"Found {} flipped normals out of {} facets",
flippedNormals.size(),
polyData.GetNumberOfCells());
if (cl.hasOption("output")) { if (cl.hasOption("output")) {
NavigableSet<Long> sorted = new TreeSet<>(flippedNormals); NavigableSet<Long> sorted = new TreeSet<>(flippedNormals);

View File

@@ -114,11 +114,9 @@ public class ValidateOBJ implements TerrasaurTool {
int n = (int) (Math.log(facetCount() / 3.) / Math.log(4.0) + 0.5); int n = (int) (Math.log(facetCount() / 3.) / Math.log(4.0) + 0.5);
if (meetsCondition) { if (meetsCondition) {
validationMsg = validationMsg =
String.format( String.format("Model has %d facets. This satisfies f = 3*4^n with n = %d.", facetCount(), n);
"Model has %d facets. This satisfies f = 3*4^n with n = %d.", facetCount(), n);
} else { } else {
validationMsg = validationMsg = String.format(
String.format(
"Model has %d facets. This does not satisfy f = 3*4^n. A shape model with %.0f facets has n = %d.", "Model has %d facets. This does not satisfy f = 3*4^n. A shape model with %.0f facets has n = %d.",
facetCount(), 3 * Math.pow(4, n), n); facetCount(), 3 * Math.pow(4, n), n);
} }
@@ -132,13 +130,10 @@ public class ValidateOBJ implements TerrasaurTool {
public boolean testVertices() { public boolean testVertices() {
boolean meetsCondition = (facetCount() + 4) / 2 == vertexCount(); boolean meetsCondition = (facetCount() + 4) / 2 == vertexCount();
if (meetsCondition) if (meetsCondition)
validationMsg = validationMsg = String.format(
String.format( "Model has %d vertices and %d facets. This satisfies v = f/2+2.", vertexCount(), facetCount());
"Model has %d vertices and %d facets. This satisfies v = f/2+2.",
vertexCount(), facetCount());
else else
validationMsg = validationMsg = String.format(
String.format(
"Model has %d vertices and %d facets. This does not satisfy v = f/2+2. Number of vertices should be %d.", "Model has %d vertices and %d facets. This does not satisfy v = f/2+2. Number of vertices should be %d.",
vertexCount(), facetCount(), facetCount() / 2 + 2); vertexCount(), facetCount(), facetCount() / 2 + 2);
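The facet and vertex checks above can be folded into one helper; a minimal sketch using the same formulas (v = f/2 + 2 is Euler's relation for a closed, genus-0 triangle mesh, since V - E + F = 2 with E = 3F/2):

    // Sketch only, not the Terrasaur implementation.
    static boolean satisfiesIcqCounts(long facetCount, long vertexCount) {
      int n = (int) (Math.log(facetCount / 3.0) / Math.log(4.0) + 0.5); // nearest n for f = 3*4^n
      boolean facetsOk = facetCount == 3L * (long) Math.pow(4, n);      // f = 3*4^n
      boolean verticesOk = vertexCount == facetCount / 2 + 2;           // v = f/2 + 2
      return facetsOk && verticesOk;
    }

For example, f = 3*4^5 = 3072 facets passes with n = 5 and requires v = 1538 vertices.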
@@ -160,8 +155,7 @@ public class ValidateOBJ implements TerrasaurTool {
polyData.GetPoint(i, iPt); polyData.GetPoint(i, iPt);
List<Long> closestVertices = new ArrayList<>(); List<Long> closestVertices = new ArrayList<>();
for (Long id : sbm.findClosestVerticesWithinRadius(iPt, tol)) for (Long id : sbm.findClosestVerticesWithinRadius(iPt, tol)) if (id > i) closestVertices.add(id);
if (id > i) closestVertices.add(id);
if (!closestVertices.isEmpty()) map.put(i, closestVertices); if (!closestVertices.isEmpty()) map.put(i, closestVertices);
if (map.containsKey(i) && !map.get(i).isEmpty()) { if (map.containsKey(i) && !map.get(i).isEmpty()) {
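The duplicate-vertex test above relies on a radius search around each point and keeps only matches with a higher index; the same idea in a brute-force, self-contained form (O(n^2), illustrative only):

    // Maps each vertex index to the indices of later vertices lying within tol of it.
    static Map<Integer, List<Integer>> findDuplicates(double[][] points, double tol) {
      Map<Integer, List<Integer>> map = new TreeMap<>();
      for (int i = 0; i < points.length; i++) {
        List<Integer> close = new ArrayList<>();
        for (int j = i + 1; j < points.length; j++) {
          double dx = points[i][0] - points[j][0];
          double dy = points[i][1] - points[j][1];
          double dz = points[i][2] - points[j][2];
          if (Math.sqrt(dx * dx + dy * dy + dz * dz) <= tol) close.add(j);
        }
        if (!close.isEmpty()) map.put(i, close);
      }
      return map;
    }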
@@ -254,8 +248,7 @@ public class ValidateOBJ implements TerrasaurTool {
// would be faster to check if id0==id1||id0==id2||id1==id2 but there may be // would be faster to check if id0==id1||id0==id2||id1==id2 but there may be
// duplicate vertices // duplicate vertices
TriangularFacet facet = TriangularFacet facet = new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
double area = facet.getArea(); double area = facet.getArea();
if (area == 0) { if (area == 0) {
zeroAreaFacets.add(i); zeroAreaFacets.add(i);
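The zero-area test reduces to the standard triangle area formula: half the magnitude of the cross product of two edge vectors. A standalone sketch:

    static double triangleArea(double[] p0, double[] p1, double[] p2) {
      double[] u = {p1[0] - p0[0], p1[1] - p0[1], p1[2] - p0[2]};
      double[] v = {p2[0] - p0[0], p2[1] - p0[1], p2[2] - p0[2]};
      double cx = u[1] * v[2] - u[2] * v[1];
      double cy = u[2] * v[0] - u[0] * v[2];
      double cz = u[0] * v[1] - u[1] * v[0];
      return 0.5 * Math.sqrt(cx * cx + cy * cy + cz * cz); // 0 for degenerate (collinear) facets
    }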
@@ -305,17 +298,15 @@ public class ValidateOBJ implements TerrasaurTool {
// would be faster to check if id0==id1||id0==id2||id1==id2 but there may be // would be faster to check if id0==id1||id0==id2||id1==id2 but there may be
// duplicate vertices // duplicate vertices
TriangularFacet facet = TriangularFacet facet = new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
if (facet.getArea() > 0) { if (facet.getArea() > 0) {
stats.addValue(facet.getCenter().getDot(facet.getNormal())); stats.addValue(facet.getCenter().createUnitized().getDot(facet.getNormal()));
cStats.add(facet.getCenter()); cStats.add(facet.getCenter());
nStats.add(facet.getNormal()); nStats.add(facet.getNormal());
} }
} }
validationMsg = validationMsg = String.format(
String.format(
"Using %d non-zero area facets: Mean angle between radial and normal is %f degrees, " "Using %d non-zero area facets: Mean angle between radial and normal is %f degrees, "
+ "angle between mean radial and mean normal is %f degrees", + "angle between mean radial and mean normal is %f degrees",
stats.getN(), stats.getN(),
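The statistic accumulated above is the dot product of the unit radial direction (facet center, unitized in the new code) with the facet normal, i.e. the cosine of the angle reported in the message. A standalone sketch of that angle, assuming both inputs are unit vectors:

    static double radialNormalAngleDeg(double[] unitRadial, double[] unitNormal) {
      double dot = unitRadial[0] * unitNormal[0]
                 + unitRadial[1] * unitNormal[1]
                 + unitRadial[2] * unitNormal[2];
      dot = Math.max(-1.0, Math.min(1.0, dot)); // clamp before acos for numerical safety
      return Math.toDegrees(Math.acos(dot));
    }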
@@ -327,40 +318,38 @@ public class ValidateOBJ implements TerrasaurTool {
private static Options defineOptions() { private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions(); Options options = TerrasaurTool.defineOptions();
options.addOption( options.addOption(Option.builder("obj")
Option.builder("obj").required().hasArg().desc("Shape model to validate.").build()); .required()
options.addOption( .hasArg()
Option.builder("logFile") .desc("Shape model to validate.")
.build());
options.addOption(Option.builder("logFile")
.hasArg() .hasArg()
.desc("If present, save screen output to log file.") .desc("If present, save screen output to log file.")
.build()); .build());
StringBuilder sb = new StringBuilder(); StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name())); for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption( options.addOption(Option.builder("logLevel")
Option.builder("logLevel")
.hasArg() .hasArg()
.desc( .desc("If present, print messages above selected priority. Valid values are "
"If present, print messages above selected priority. Valid values are "
+ sb.toString().trim() + sb.toString().trim()
+ ". Default is INFO.") + ". Default is INFO.")
.build()); .build());
options.addOption(Option.builder("output").hasArg().desc("Filename for output OBJ.").build()); options.addOption(Option.builder("output")
options.addOption( .hasArg()
Option.builder("removeDuplicateVertices") .desc("Filename for output OBJ.")
.build());
options.addOption(Option.builder("removeDuplicateVertices")
.desc("Remove duplicate vertices. Use with -output to save OBJ.") .desc("Remove duplicate vertices. Use with -output to save OBJ.")
.build()); .build());
options.addOption( options.addOption(Option.builder("removeUnreferencedVertices")
Option.builder("removeUnreferencedVertices")
.desc("Remove unreferenced vertices. Use with -output to save OBJ.") .desc("Remove unreferenced vertices. Use with -output to save OBJ.")
.build()); .build());
options.addOption( options.addOption(Option.builder("removeZeroAreaFacets")
Option.builder("removeZeroAreaFacets")
.desc("Remove facets with zero area. Use with -output to save OBJ.") .desc("Remove facets with zero area. Use with -output to save OBJ.")
.build()); .build());
options.addOption( options.addOption(Option.builder("cleanup")
Option.builder("cleanup") .desc("Combines -removeDuplicateVertices, -removeUnreferencedVertices, and -removeZeroAreaFacets.")
.desc(
"Combines -removeDuplicateVertices, -removeUnreferencedVertices, and -removeZeroAreaFacets.")
.build()); .build());
return options; return options;
} }
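A minimal, self-contained sketch of the Apache Commons CLI pattern used throughout these tools; the option names are taken from the definitions above, the descriptions are abbreviated:

    import org.apache.commons.cli.*;

    class CliDemo {
      public static void main(String[] args) throws ParseException {
        Options options = new Options();
        options.addOption(Option.builder("obj")
            .required()
            .hasArg()
            .desc("Shape model to validate.")
            .build());
        options.addOption(Option.builder("output").hasArg().desc("Filename for output OBJ.").build());
        options.addOption(Option.builder("cleanup").desc("Run all cleanup steps.").build());

        CommandLine cl = new DefaultParser().parse(options, args);
        String objFile = cl.getOptionValue("obj");
        boolean cleanup = cl.hasOption("cleanup");
        System.out.println("obj=" + objFile + " cleanup=" + cleanup);
      }
    }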
@@ -373,8 +362,7 @@ public class ValidateOBJ implements TerrasaurTool {
CommandLine cl = defaultOBJ.parseArgs(args, options); CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl); Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet()) for (MessageLabel ml : startupMessages.keySet()) logger.info("{} {}", ml.label, startupMessages.get(ml));
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
NativeLibraryLoader.loadVtkLibraries(); NativeLibraryLoader.loadVtkLibraries();
vtkPolyData polyData = PolyDataUtil.loadShapeModel(cl.getOptionValue("obj")); vtkPolyData polyData = PolyDataUtil.loadShapeModel(cl.getOptionValue("obj"));
@@ -406,8 +394,7 @@ public class ValidateOBJ implements TerrasaurTool {
final boolean cleanup = cl.hasOption("cleanup"); final boolean cleanup = cl.hasOption("cleanup");
final boolean removeDuplicateVertices = cleanup || cl.hasOption("removeDuplicateVertices"); final boolean removeDuplicateVertices = cleanup || cl.hasOption("removeDuplicateVertices");
final boolean removeUnreferencedVertices = final boolean removeUnreferencedVertices = cleanup || cl.hasOption("removeUnreferencedVertices");
cleanup || cl.hasOption("removeUnreferencedVertices");
final boolean removeZeroAreaFacets = cleanup || cl.hasOption("removeZeroAreaFacets"); final boolean removeZeroAreaFacets = cleanup || cl.hasOption("removeZeroAreaFacets");
if (removeDuplicateVertices) polyData = PolyDataUtil.removeDuplicatePoints(polyData); if (removeDuplicateVertices) polyData = PolyDataUtil.removeDuplicatePoints(polyData);
@@ -418,7 +405,7 @@ public class ValidateOBJ implements TerrasaurTool {
if (cl.hasOption("output")) { if (cl.hasOption("output")) {
PolyDataUtil.saveShapeModelAsOBJ(polyData, cl.getOptionValue("output")); PolyDataUtil.saveShapeModelAsOBJ(polyData, cl.getOptionValue("output"));
logger.info(String.format("Wrote OBJ file %s", cl.getOptionValue("output"))); logger.info("Wrote OBJ file {}", cl.getOptionValue("output"));
} }
} }
} }

View File

@@ -22,15 +22,16 @@
*/ */
package terrasaur.config; package terrasaur.config;
import java.util.List;
import jackfruit.annotations.Comment; import jackfruit.annotations.Comment;
import jackfruit.annotations.DefaultValue; import jackfruit.annotations.DefaultValue;
import jackfruit.annotations.Jackfruit; import jackfruit.annotations.Jackfruit;
import java.util.List;
@Jackfruit @Jackfruit
public interface CKFromSumFileConfig { public interface CKFromSumFileConfig {
@Comment(""" @Comment(
"""
Body fixed frame for the target body. If blank, use SPICE-defined Body fixed frame for the target body. If blank, use SPICE-defined
body fixed frame. This will be the reference frame unless the J2000 body fixed frame. This will be the reference frame unless the J2000
parameter is set to true.""") parameter is set to true.""")
@@ -41,14 +42,16 @@ public interface CKFromSumFileConfig {
@DefaultValue("DIMORPHOS") @DefaultValue("DIMORPHOS")
String bodyName(); String bodyName();
@Comment(""" @Comment(
"""
Extend CK past the last sumFile by this number of seconds. Default Extend CK past the last sumFile by this number of seconds. Default
is zero. Attitude is assumed to be fixed to the value given by the is zero. Attitude is assumed to be fixed to the value given by the
last sumfile.""") last sumfile.""")
@DefaultValue("0") @DefaultValue("0")
double extend(); double extend();
@Comment(""" @Comment(
"""
SPC defines the camera X axis to be increasing to the right, Y to SPC defines the camera X axis to be increasing to the right, Y to
be increasing down, and Z to point into the page: be increasing down, and Z to point into the page:
@@ -85,7 +88,8 @@ public interface CKFromSumFileConfig {
@DefaultValue("-3") @DefaultValue("-3")
int flipZ(); int flipZ();
@Comment(""" @Comment(
"""
Supply this frame kernel to MSOPCK. Only needed if the reference frame Supply this frame kernel to MSOPCK. Only needed if the reference frame
(set by bodyFrame or J2000) is not built into SPICE""") (set by bodyFrame or J2000) is not built into SPICE""")
@DefaultValue("/project/dart/data/SPICE/flight/fk/didymos_system_001.tf") @DefaultValue("/project/dart/data/SPICE/flight/fk/didymos_system_001.tf")
@@ -111,11 +115,10 @@ public interface CKFromSumFileConfig {
@DefaultValue("DART_SPACECRAFT") @DefaultValue("DART_SPACECRAFT")
String spacecraftFrame(); String spacecraftFrame();
@Comment(""" @Comment(
"""
SPICE metakernel to read. This may be specified more than once SPICE metakernel to read. This may be specified more than once
for multiple metakernels.""") for multiple metakernels.""")
@DefaultValue("/project/dart/data/SPICE/flight/mk/current.tm") @DefaultValue("/project/dart/data/SPICE/flight/mk/current.tm")
List<String> metakernel(); List<String> metakernel();
} }
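The @Jackfruit / @Comment / @DefaultValue annotations above drive the generated configuration file. A hedged, minimal sketch of an interface in the same style; the interface and its keys are illustrative only, not part of Terrasaur:

    import jackfruit.annotations.Comment;
    import jackfruit.annotations.DefaultValue;
    import jackfruit.annotations.Jackfruit;

    @Jackfruit
    public interface ExampleConfig {

      @Comment("Name of the target body.")
      @DefaultValue("DIMORPHOS")
      String bodyName();

      @Comment("Extend the CK past the last sumFile by this many seconds.")
      @DefaultValue("0")
      double extend();
    }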

View File

@@ -29,8 +29,8 @@ import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger; import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.spi.StandardLevel; import org.apache.logging.log4j.spi.StandardLevel;
import terrasaur.utils.saaPlotLib.colorMaps.ColorRamp;
import terrasaur.utils.Log4j2Configurator; import terrasaur.utils.Log4j2Configurator;
import terrasaur.utils.saaPlotLib.colorMaps.ColorRamp;
public class CommandLineOptions { public class CommandLineOptions {
@@ -42,7 +42,11 @@ public class CommandLineOptions {
* @return * @return
*/ */
public static Option addConfig() { public static Option addConfig() {
return Option.builder("config").hasArg().required().desc("Configuration file to load").build(); return Option.builder("config")
.hasArg()
.required()
.desc("Configuration file to load")
.build();
} }
/** /**
@@ -56,8 +60,7 @@ public class CommandLineOptions {
for (ColorRamp.TYPE t : ColorRamp.TYPE.values()) sb.append(String.format("%s ", t.name())); for (ColorRamp.TYPE t : ColorRamp.TYPE.values()) sb.append(String.format("%s ", t.name()));
return Option.builder("colorRamp") return Option.builder("colorRamp")
.hasArg() .hasArg()
.desc( .desc("Color ramp style. Valid values are "
"Color ramp style. Valid values are "
+ sb.toString().trim() + sb.toString().trim()
+ ". Default is " + ". Default is "
+ defaultCRType.name() + defaultCRType.name()
@@ -73,9 +76,9 @@ public class CommandLineOptions {
* @return * @return
*/ */
public static ColorRamp.TYPE getColorRamp(CommandLine cl, ColorRamp.TYPE defaultCRType) { public static ColorRamp.TYPE getColorRamp(CommandLine cl, ColorRamp.TYPE defaultCRType) {
ColorRamp.TYPE crType = ColorRamp.TYPE crType = cl.hasOption("colorRamp")
cl.hasOption("colorRamp") ? ColorRamp.TYPE.valueOf(
? ColorRamp.TYPE.valueOf(cl.getOptionValue("colorRamp").toUpperCase().strip()) cl.getOptionValue("colorRamp").toUpperCase().strip())
: defaultCRType; : defaultCRType;
return crType; return crType;
} }
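getColorRamp follows a general pattern: read an enum-valued option if present, otherwise fall back to a default. A generic sketch of that pattern (helper name is illustrative, not part of the Terrasaur API):

    static <E extends Enum<E>> E enumOption(CommandLine cl, String opt, Class<E> type, E fallback) {
      return cl.hasOption(opt)
          ? Enum.valueOf(type, cl.getOptionValue(opt).toUpperCase().strip())
          : fallback;
    }

Calling enumOption(cl, "colorRamp", ColorRamp.TYPE.class, defaultCRType) would reproduce the behavior of the method above.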
@@ -150,8 +153,7 @@ public class CommandLineOptions {
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name())); for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
return Option.builder("logLevel") return Option.builder("logLevel")
.hasArg() .hasArg()
.desc( .desc("If present, print messages above selected priority. Valid values are "
"If present, print messages above selected priority. Valid values are "
+ sb.toString().trim() + sb.toString().trim()
+ ". Default is INFO.") + ". Default is INFO.")
.build(); .build();
@@ -165,7 +167,8 @@ public class CommandLineOptions {
public static void setLogLevel(CommandLine cl) { public static void setLogLevel(CommandLine cl) {
Log4j2Configurator lc = Log4j2Configurator.getInstance(); Log4j2Configurator lc = Log4j2Configurator.getInstance();
if (cl.hasOption("logLevel")) if (cl.hasOption("logLevel"))
lc.setLevel(Level.valueOf(cl.getOptionValue("logLevel").toUpperCase().trim())); lc.setLevel(
Level.valueOf(cl.getOptionValue("logLevel").toUpperCase().trim()));
} }
/** /**

View File

@@ -25,7 +25,6 @@ package terrasaur.config;
import jackfruit.annotations.Comment; import jackfruit.annotations.Comment;
import jackfruit.annotations.DefaultValue; import jackfruit.annotations.DefaultValue;
import jackfruit.annotations.Jackfruit; import jackfruit.annotations.Jackfruit;
import java.util.List; import java.util.List;
@Jackfruit(prefix = "mission") @Jackfruit(prefix = "mission")

View File

@@ -36,7 +36,9 @@ public interface SPCBlock {
############################################################################### ###############################################################################
"""; """;
@Comment(introLines + """ @Comment(
introLines
+ """
SPC defines the camera X axis to be increasing to the right, Y to SPC defines the camera X axis to be increasing to the right, Y to
be increasing down, and Z to point into the page: be increasing down, and Z to point into the page:
@@ -72,5 +74,4 @@ public interface SPCBlock {
@Comment("Map the camera Z axis to a SPICE axis. See flipX for details.") @Comment("Map the camera Z axis to a SPICE axis. See flipX for details.")
@DefaultValue("-3") @DefaultValue("-3")
int flipZ(); int flipZ();
} }

View File

@@ -92,14 +92,12 @@ public class TerrasaurConfig {
PropertiesConfiguration config = new ConfigBlockFactory().toConfig(instance.configBlock); PropertiesConfiguration config = new ConfigBlockFactory().toConfig(instance.configBlock);
PropertiesConfigurationLayout layout = config.getLayout(); PropertiesConfigurationLayout layout = config.getLayout();
String now = String now = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
.withLocale(Locale.getDefault()) .withLocale(Locale.getDefault())
.withZone(ZoneOffset.UTC) .withZone(ZoneOffset.UTC)
.format(Instant.now()); .format(Instant.now());
layout.setHeaderComment( layout.setHeaderComment(
String.format( String.format("Configuration file for %s\nCreated %s UTC", AppVersion.getVersionString(), now));
"Configuration file for %s\nCreated %s UTC", AppVersion.getVersionString(), now));
config.write(pw); config.write(pw);
} catch (ConfigurationException | IOException e) { } catch (ConfigurationException | IOException e) {
@@ -107,5 +105,4 @@ public class TerrasaurConfig {
} }
return string.toString(); return string.toString();
} }
} }

View File

@@ -92,10 +92,8 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
/* /*
* DTM fits. Includes fits files at various intermediate stages such as the initial fits file from * DTM fits. Includes fits files at various intermediate stages such as the initial fits file from
* Maplet2Fits or CreateMapola, the fits files that contain gravity planes output from * Maplet2Fits or CreateMapola, the fits files that contain gravity planes output from
@@ -126,7 +124,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
// OBJ shape model as FITS file // OBJ shape model as FITS file
@@ -153,7 +150,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
// SIGMA. Used in calculation of some of the ALTWG products. // SIGMA. Used in calculation of some of the ALTWG products.
@@ -178,7 +174,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
VERTEX_ERROR("Sigma of vertex vector", "vrt", "km", "Sigma", null, "Vertex Radius") { VERTEX_ERROR("Sigma of vertex vector", "vrt", "km", "Sigma", null, "Vertex Radius") {
@@ -203,8 +198,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
/* /*
@@ -238,7 +231,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return AltwgDataType.NORMALX_UNCERTAINTY; return AltwgDataType.NORMALX_UNCERTAINTY;
} }
}, },
/* /*
@@ -266,7 +258,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NORMALY_UNCERTAINTY; return NORMALY_UNCERTAINTY;
} }
}, },
/* /*
@@ -294,7 +285,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NORMALZ_UNCERTAINTY; return NORMALZ_UNCERTAINTY;
} }
}, },
GRAVITY_VECTOR_X("gravity vector", "grv", "m/s^2", "Gravity vector X", null, null) { GRAVITY_VECTOR_X("gravity vector", "grv", "m/s^2", "Gravity vector X", null, null) {
@@ -317,7 +307,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return GRAVITY_VECTOR_X_UNCERTAINTY; return GRAVITY_VECTOR_X_UNCERTAINTY;
} }
}, },
/* /*
@@ -345,7 +334,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return AltwgDataType.GRAVITY_VECTOR_Y_UNCERTAINTY; return AltwgDataType.GRAVITY_VECTOR_Y_UNCERTAINTY;
} }
}, },
/* /*
@@ -373,11 +361,9 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return GRAVITY_VECTOR_Z_UNCERTAINTY; return GRAVITY_VECTOR_Z_UNCERTAINTY;
} }
}, },
GRAVITATIONAL_MAGNITUDE("gravitational magnitude", "grm", "m/s^2", "Gravitational magnitude", GRAVITATIONAL_MAGNITUDE("gravitational magnitude", "grm", "m/s^2", "Gravitational magnitude", null, null) {
null, null) {
@Override @Override
public int getGlobalSpocID() { public int getGlobalSpocID() {
return 10; return 10;
@@ -397,11 +383,9 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return AltwgDataType.GRAVITATIONAL_MAGNITUDE_UNCERTAINTY; return AltwgDataType.GRAVITATIONAL_MAGNITUDE_UNCERTAINTY;
} }
}, },
GRAVITATIONAL_POTENTIAL("gravitational potential", "pot", "J/kg", "Gravitational potential", null, GRAVITATIONAL_POTENTIAL("gravitational potential", "pot", "J/kg", "Gravitational potential", null, null) {
null) {
@Override @Override
public int getGlobalSpocID() { public int getGlobalSpocID() {
return 11; return 11;
@@ -421,7 +405,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return AltwgDataType.GRAVITATIONAL_POTENTIAL_UNCERTAINTY; return AltwgDataType.GRAVITATIONAL_POTENTIAL_UNCERTAINTY;
} }
}, },
ELEVATION("elevation", "elv", "meters", "Elevation", null, null) { ELEVATION("elevation", "elv", "meters", "Elevation", null, null) {
@@ -444,7 +427,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return EL_UNCERTAINTY; return EL_UNCERTAINTY;
} }
}, },
// No longer needed! This is same as HEIGHT plane! // No longer needed! This is same as HEIGHT plane!
@@ -486,7 +468,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return TILT_UNCERTAINTY; return TILT_UNCERTAINTY;
} }
}, },
FACET_MAP("Facet Map", "ffi", null, "Facet Coarse Index", null, "Facet Coarse Index") { FACET_MAP("Facet Map", "ffi", null, "Facet Coarse Index", null, "Facet Coarse Index") {
@@ -510,7 +491,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
FACET_TILT("Facet Tilt", "fti", "degrees", "Facet tilt", null, "Facet Tilt") { FACET_TILT("Facet Tilt", "fti", "degrees", "Facet tilt", null, "Facet Tilt") {
@@ -533,11 +513,10 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return TILT_UNCERTAINTY; return TILT_UNCERTAINTY;
} }
}, },
FACET_TILT_DIRECTION("Facet Tilt Direction", "fdi", "degrees", "Facet tilt direction", null, FACET_TILT_DIRECTION(
"Facet Tilt Direction") { "Facet Tilt Direction", "fdi", "degrees", "Facet tilt direction", null, "Facet Tilt Direction") {
@Override @Override
public int getGlobalSpocID() { public int getGlobalSpocID() {
return 35; return 35;
@@ -557,7 +536,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return TILT_UNCERTAINTY; return TILT_UNCERTAINTY;
} }
}, },
TILT_AVG("Mean Tilt", "mti", "degrees", "Mean tilt", null, "Mean Tilt") { TILT_AVG("Mean Tilt", "mti", "degrees", "Mean tilt", null, "Mean Tilt") {
@@ -580,8 +558,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return TILT_AVG_UNCERTAINTY; return TILT_AVG_UNCERTAINTY;
} }
}, },
TILT_VARIATION("Tilt Variation", "tiv", "degrees", "Tilt variation", null, "Tilt Variation") { TILT_VARIATION("Tilt Variation", "tiv", "degrees", "Tilt variation", null, "Tilt Variation") {
@@ -604,11 +580,9 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return TILT_RELATIVE_UNCERTAINTY; return TILT_RELATIVE_UNCERTAINTY;
} }
}, },
TILT_AVG_DIRECTION("Mean Tilt Direction", "mdi", "degrees", "Mean tilt direction", null, TILT_AVG_DIRECTION("Mean Tilt Direction", "mdi", "degrees", "Mean tilt direction", null, "Mean Tilt Direction") {
"Mean Tilt Direction") {
@Override @Override
public int getGlobalSpocID() { public int getGlobalSpocID() {
return 37; return 37;
@@ -628,11 +602,15 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return TILT_AVG_DIRECTION_UNCERTAINTY; return TILT_AVG_DIRECTION_UNCERTAINTY;
} }
}, },
TILT_DIRECTION_VARIATION("Tilt Direction Variation", "div", "degrees", "Tilt direction variation", TILT_DIRECTION_VARIATION(
null, "Tilt Direction Variation") { "Tilt Direction Variation",
"div",
"degrees",
"Tilt direction variation",
null,
"Tilt Direction Variation") {
@Override @Override
public int getGlobalSpocID() { public int getGlobalSpocID() {
return 39; return 39;
@@ -652,7 +630,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return TILT_RELATIVE_DIRECTION_UNCERTAINTY; return TILT_RELATIVE_DIRECTION_UNCERTAINTY;
} }
}, },
TILT_RELATIVE("Relative Tilt", "rti", "degrees", "Relative tilt", null, "Relative Tilt") { TILT_RELATIVE("Relative Tilt", "rti", "degrees", "Relative tilt", null, "Relative Tilt") {
@@ -675,11 +652,10 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return TILT_RELATIVE_UNCERTAINTY; return TILT_RELATIVE_UNCERTAINTY;
} }
}, },
TILT_RELATIVE_DIRECTION("Relative Tilt Direction", "rdi", "degrees", "Relative tilt direction", TILT_RELATIVE_DIRECTION(
null, "Relative Tilt Direction") { "Relative Tilt Direction", "rdi", "degrees", "Relative tilt direction", null, "Relative Tilt Direction") {
@Override @Override
public int getGlobalSpocID() { public int getGlobalSpocID() {
return 43; return 43;
@@ -699,7 +675,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return TILT_RELATIVE_DIRECTION_UNCERTAINTY; return TILT_RELATIVE_DIRECTION_UNCERTAINTY;
} }
}, },
/* /*
@@ -711,8 +686,8 @@ public enum AltwgDataType {
* getAnciColName() returns the 'description' string * getAnciColName() returns the 'description' string
*/ */
RELATIVE_HEIGHT("Max height/depth within an ellipse", "mht", "km", "Max relative height", null, RELATIVE_HEIGHT(
"Max Relative Height") { "Max height/depth within an ellipse", "mht", "km", "Max relative height", null, "Max Relative Height") {
@Override @Override
public int getGlobalSpocID() { public int getGlobalSpocID() {
return 49; return 49;
@@ -732,7 +707,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return RELATIVE_HEIGHT_UNCERTAINTY; return RELATIVE_HEIGHT_UNCERTAINTY;
} }
}, },
FACETRAD("Facet Radius", "rad", "km", "Facet radius", null, "Facet_Center_Radius") { FACETRAD("Facet Radius", "rad", "km", "Facet radius", null, "Facet_Center_Radius") {
@@ -755,7 +729,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return FACETRAD_UNCERTAINTY; return FACETRAD_UNCERTAINTY;
} }
}, },
/* /*
@@ -763,8 +736,7 @@ public enum AltwgDataType {
* fileFrag, String units, String headerValue, String comment, String anciColName) * fileFrag, String units, String headerValue, String comment, String anciColName)
*/ */
TPLATEANCI("Ancillary Template Scalar", "tes", "TBDCOLUNITS", "AncillaryTemplate", null, TPLATEANCI("Ancillary Template Scalar", "tes", "TBDCOLUNITS", "AncillaryTemplate", null, "TBDCOLNAME") {
"TBDCOLNAME") {
@Override @Override
public int getGlobalSpocID() { public int getGlobalSpocID() {
return -1; return -1;
@@ -785,12 +757,10 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
// ancillary template, scalar // ancillary template, scalar
TPLATEANCIVEC("Ancillary Template Vector", "tev", "TBD", "AncillaryTemplate", null, TPLATEANCIVEC("Ancillary Template Vector", "tev", "TBD", "AncillaryTemplate", null, "TBDCOLNAME") {
"TBDCOLNAME") {
@Override @Override
public int getGlobalSpocID() { public int getGlobalSpocID() {
return -1; return -1;
@@ -811,7 +781,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
AREA("Facet Area", "are", "km^2", "Facet Area", null, "Facet Area") { AREA("Facet Area", "are", "km^2", "Facet Area", null, "Facet Area") {
@@ -835,7 +804,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return AREA_UNCERTAINTY; return AREA_UNCERTAINTY;
} }
}, },
/* /*
@@ -846,7 +814,12 @@ public enum AltwgDataType {
* fits header anciColName - string to use when defining the column name in the ancillary fits * fits header anciColName - string to use when defining the column name in the ancillary fits
* file. If null then getAnciColName() returns the 'description' string * file. If null then getAnciColName() returns the 'description' string
*/ */
AREA_UNCERTAINTY("FacetArea Uncertainty", "notapplicable", "km^2", "facet area uncertainty", null, AREA_UNCERTAINTY(
"FacetArea Uncertainty",
"notapplicable",
"km^2",
"facet area uncertainty",
null,
"facet area uncertainty") { "facet area uncertainty") {
@Override @Override
@@ -868,12 +841,9 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
EL_UNCERTAINTY("Elevation Uncertainty", "notapplicable", "m", "El uncertainty", null, "elevation uncertainty") {
EL_UNCERTAINTY("Elevation Uncertainty", "notapplicable", "m", "El uncertainty", null,
"elevation uncertainty") {
@Override @Override
public int getGlobalSpocID() { public int getGlobalSpocID() {
@@ -894,7 +864,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
/* /*
@@ -907,8 +876,8 @@ public enum AltwgDataType {
*/ */
// Uncertainty in gravity vector, x component // Uncertainty in gravity vector, x component
GRAVITY_VECTOR_X_UNCERTAINTY("Grav_X Uncertainty", "notapplicable", "m/s^2", "GravX uncertainty", GRAVITY_VECTOR_X_UNCERTAINTY(
null, "vector component uncertainty") { "Grav_X Uncertainty", "notapplicable", "m/s^2", "GravX uncertainty", null, "vector component uncertainty") {
@Override @Override
// not a SPOC product, just the SIGMA column in the ancillary fits file // not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -932,12 +901,11 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
// Uncertainty in gravity vector, y component // Uncertainty in gravity vector, y component
GRAVITY_VECTOR_Y_UNCERTAINTY("Grav_Y Uncertainty", "notapplicable", "m/s^2", "GravY uncertainty", GRAVITY_VECTOR_Y_UNCERTAINTY(
null, "vector component uncertainty") { "Grav_Y Uncertainty", "notapplicable", "m/s^2", "GravY uncertainty", null, "vector component uncertainty") {
@Override @Override
// not a SPOC product, just the SIGMA column in the ancillary fits file // not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -961,12 +929,11 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
// Uncertainty in gravity vector, z component // Uncertainty in gravity vector, z component
GRAVITY_VECTOR_Z_UNCERTAINTY("Grav_Z Uncertainty", "notapplicable", "m/s^2", "GravZ uncertainty", GRAVITY_VECTOR_Z_UNCERTAINTY(
null, "vector component uncertainty") { "Grav_Z Uncertainty", "notapplicable", "m/s^2", "GravZ uncertainty", null, "vector component uncertainty") {
@Override @Override
// not a SPOC product, just the SIGMA column in the ancillary fits file // not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -990,12 +957,16 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
// Uncertainty in gravitational magnitude // Uncertainty in gravitational magnitude
GRAVITATIONAL_MAGNITUDE_UNCERTAINTY("Grav Mag Uncertainty", "notapplicable", "m/s^2", GRAVITATIONAL_MAGNITUDE_UNCERTAINTY(
"GravMag uncertainty", null, "grav magnitude uncertainty") { "Grav Mag Uncertainty",
"notapplicable",
"m/s^2",
"GravMag uncertainty",
null,
"grav magnitude uncertainty") {
@Override @Override
// not a SPOC product, just the SIGMA column in the ancillary fits file // not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -1019,12 +990,16 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
// Uncertainty in gravitational potential // Uncertainty in gravitational potential
GRAVITATIONAL_POTENTIAL_UNCERTAINTY("Grav Pot Uncertainty", "notapplicable", "m/s^2", GRAVITATIONAL_POTENTIAL_UNCERTAINTY(
"GravPot Uncertainty", null, "grav potential uncertainty") { "Grav Pot Uncertainty",
"notapplicable",
"m/s^2",
"GravPot Uncertainty",
null,
"grav potential uncertainty") {
@Override @Override
// not a SPOC product, just the SIGMA column in the ancillary fits file // not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -1048,12 +1023,15 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
// Uncertainty in Normal Vector, X component // Uncertainty in Normal Vector, X component
NORMALX_UNCERTAINTY("Normal X Uncertainty", "notapplicable", null, "Normal X Uncertainty", null, NORMALX_UNCERTAINTY(
"Normal X Uncertainty",
"notapplicable",
null,
"Normal X Uncertainty",
null,
"vector component uncertainty") { "vector component uncertainty") {
@Override @Override
@@ -1078,11 +1056,15 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
// Uncertainty in Normal Vector, Y component // Uncertainty in Normal Vector, Y component
NORMALY_UNCERTAINTY("Normal Y Uncertainty", "notapplicable", null, "Normal Y Uncertainty", null, NORMALY_UNCERTAINTY(
"Normal Y Uncertainty",
"notapplicable",
null,
"Normal Y Uncertainty",
null,
"vector component uncertainty") { "vector component uncertainty") {
@Override @Override
@@ -1107,11 +1089,15 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
// Uncertainty in Normal Vector, Z component // Uncertainty in Normal Vector, Z component
NORMALZ_UNCERTAINTY("Normal Z Uncertainty", "notapplicable", null, "Normal Z Uncertainty", null, NORMALZ_UNCERTAINTY(
"Normal Z Uncertainty",
"notapplicable",
null,
"Normal Z Uncertainty",
null,
"vector component uncertainty") { "vector component uncertainty") {
@Override @Override
@@ -1136,7 +1122,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
// tilt uncertainty // tilt uncertainty
@@ -1147,8 +1132,7 @@ public enum AltwgDataType {
* anciColName - string to use when defining the column name in the ancillary fits file. If null * anciColName - string to use when defining the column name in the ancillary fits file. If null
* then getAnciColName() returns the 'description' string * then getAnciColName() returns the 'description' string
*/ */
TILT_UNCERTAINTY("Tilt Uncertainty", "notapplicable", "degrees", "Tilt Uncertainty", null, TILT_UNCERTAINTY("Tilt Uncertainty", "notapplicable", "degrees", "Tilt Uncertainty", null, "tilt uncertainty") {
"tilt uncertainty") {
@Override @Override
// not a SPOC product, just the SIGMA column in the ancillary fits file // not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -1172,15 +1156,19 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
/* /*
* tilt variation standard error. Used to populate the SIGMA column for the tilt variation * tilt variation standard error. Used to populate the SIGMA column for the tilt variation
* ancillary product. * ancillary product.
*/ */
TILTVAR_UNCERTAINTY("Tilt Variation Uncertainty", "notapplicable", "degrees", TILTVAR_UNCERTAINTY(
"Tilt Variation Uncertainty", null, "variation uncertainty") { "Tilt Variation Uncertainty",
"notapplicable",
"degrees",
"Tilt Variation Uncertainty",
null,
"variation uncertainty") {
@Override @Override
// not a SPOC product, just the SIGMA column in the ancillary fits file // not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -1204,15 +1192,19 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
/* /*
* tilt variation standard error. Used to populate the SIGMA column for the tilt variation * tilt variation standard error. Used to populate the SIGMA column for the tilt variation
* ancillary product. * ancillary product.
*/ */
TILTDIRVAR_UNCERTAINTY("Tilt Direction Variation Uncertainty", "notapplicable", "degrees", TILTDIRVAR_UNCERTAINTY(
"Tilt Direction Variation Uncertainty", null, "direction variation uncertainty") { "Tilt Direction Variation Uncertainty",
"notapplicable",
"degrees",
"Tilt Direction Variation Uncertainty",
null,
"direction variation uncertainty") {
@Override @Override
// not a SPOC product, just the SIGMA column in the ancillary fits file // not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -1236,10 +1228,8 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
/* /*
* albedo from SPC description - general description of the product type fileFrag - string * albedo from SPC description - general description of the product type fileFrag - string
* fragment to use in ALTWG file naming convention units - units of the data values headerValue - * fragment to use in ALTWG file naming convention units - units of the data values headerValue -
@@ -1269,11 +1259,10 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return ALBEDO_UNCERTAINTY; return ALBEDO_UNCERTAINTY;
} }
}, },
ALBEDO_UNCERTAINTY("Albedo Uncertainty", "notapplicable", "unitless", "Albedo uncertainty", null, ALBEDO_UNCERTAINTY(
"albedo uncertainty") { "Albedo Uncertainty", "notapplicable", "unitless", "Albedo uncertainty", null, "albedo uncertainty") {
@Override @Override
// not a SPOC product, just the SIGMA column in the ancillary fits file // not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -1297,7 +1286,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
// OLA intensity. Used in place of albedo // OLA intensity. Used in place of albedo
@@ -1323,10 +1311,8 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return INTENSITY_UNCERTAINTY; return INTENSITY_UNCERTAINTY;
} }
}, },
// OLA intensity. Used in place of albedo // OLA intensity. Used in place of albedo
INTENSITY_UNCERTAINTY("Intensity", "alb", "unitless", "Intensity", null, "intensity") { INTENSITY_UNCERTAINTY("Intensity", "alb", "unitless", "Intensity", null, "intensity") {
@@ -1350,10 +1336,8 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
/* /*
* Each enumeration has the following properties in the order shown: description - general * Each enumeration has the following properties in the order shown: description - general
* description of the product type fileFrag - string fragment to use in ALTWG file naming * description of the product type fileFrag - string fragment to use in ALTWG file naming
@@ -1363,8 +1347,13 @@ public enum AltwgDataType {
* getAnciColName() returns the 'description' string * getAnciColName() returns the 'description' string
*/ */
RELATIVE_HEIGHT_UNCERTAINTY("Max relative height uncertainty", "notapplicable", "km", RELATIVE_HEIGHT_UNCERTAINTY(
"Max relative height uncertainty", null, "max relative height uncertainty") { "Max relative height uncertainty",
"notapplicable",
"km",
"Max relative height uncertainty",
null,
"max relative height uncertainty") {
@Override @Override
public int getGlobalSpocID() { public int getGlobalSpocID() {
return -1; return -1;
@@ -1384,12 +1373,16 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
// facet radius uncertainty // facet radius uncertainty
FACETRAD_UNCERTAINTY("Facet Radius Uncertainty", "notapplicable", null, FACETRAD_UNCERTAINTY(
"Facet Radius Uncertainty", null, "facet radius uncertainty") { "Facet Radius Uncertainty",
"notapplicable",
null,
"Facet Radius Uncertainty",
null,
"facet radius uncertainty") {
@Override @Override
// not a SPOC product, just the SIGMA column in the ancillary fits file // not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -1413,13 +1406,11 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
// uncertainty in mean tilt (TILT_AVG) // uncertainty in mean tilt (TILT_AVG)
TILT_AVG_UNCERTAINTY("Mean Tilt Uncertainty", "notapplicable", null, "Mean Tilt Uncertainty", TILT_AVG_UNCERTAINTY(
null, "mean tilt uncertainty") { "Mean Tilt Uncertainty", "notapplicable", null, "Mean Tilt Uncertainty", null, "mean tilt uncertainty") {
@Override @Override
// not a SPOC product, just the SIGMA column in the ancillary fits file // not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -1443,13 +1434,16 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
// uncertainty in mean tilt direction (TILT_AVG_DIRECTION) // uncertainty in mean tilt direction (TILT_AVG_DIRECTION)
TILT_AVG_DIRECTION_UNCERTAINTY("Mean Tilt Direction Uncertainty", "notapplicable", null, TILT_AVG_DIRECTION_UNCERTAINTY(
"Mean Tilt Direction Uncertainty", null, "mean tilt direction uncertainty") { "Mean Tilt Direction Uncertainty",
"notapplicable",
null,
"Mean Tilt Direction Uncertainty",
null,
"mean tilt direction uncertainty") {
@Override @Override
// not a SPOC product, just the SIGMA column in the ancillary fits file // not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -1473,13 +1467,16 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}, },
// uncertainty in relative tilt (TILT_RELATIVE) // uncertainty in relative tilt (TILT_RELATIVE)
TILT_RELATIVE_UNCERTAINTY("Relative Tilt Uncertainty", "notapplicable", null, TILT_RELATIVE_UNCERTAINTY(
"Relative Tilt Uncertainty", null, "relative tilt uncertainty") { "Relative Tilt Uncertainty",
"notapplicable",
null,
"Relative Tilt Uncertainty",
null,
"relative tilt uncertainty") {
@Override @Override
// not a SPOC product, just the SIGMA column in the ancillary fits file // not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -1503,42 +1500,16 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
},
// uncertainty in relative tilt direction (TILT_RELATIVE_DIRECTION)
TILT_RELATIVE_DIRECTION_UNCERTAINTY("Relative Tilt Direction Uncertainty", "notapplicable", null,
"Relative Tilt Direction Uncertainty", null, "relative tilt direction uncertainty") {
@Override
// not a SPOC product, just the SIGMA column in the ancillary fits file
public int getGlobalSpocID() {
return -1;
}
@Override
// not a SPOC product, just the SIGMA column in the ancillary fits file
public int getLocalSpocID() {
return -1;
}
// no such plane in DTM
@Override
public PlaneInfo getPlaneInfo() {
return null;
}
@Override
public AltwgDataType getSigmaType() {
return NA;
}
}, },
// uncertainty in relative tilt direction (TILT_RELATIVE_DIRECTION) // uncertainty in relative tilt direction (TILT_RELATIVE_DIRECTION)
TILT_DIRECTION_UNCERTAINTY("Tilt Direction Uncertainty", "notapplicable", null, TILT_RELATIVE_DIRECTION_UNCERTAINTY(
"Tilt Direction Uncertainty", null, "tilt direction uncertainty") { "Relative Tilt Direction Uncertainty",
"notapplicable",
null,
"Relative Tilt Direction Uncertainty",
null,
"relative tilt direction uncertainty") {
@Override @Override
// not a SPOC product, just the SIGMA column in the ancillary fits file // not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -1562,7 +1533,39 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
},
// uncertainty in relative tilt direction (TILT_RELATIVE_DIRECTION)
TILT_DIRECTION_UNCERTAINTY(
"Tilt Direction Uncertainty",
"notapplicable",
null,
"Tilt Direction Uncertainty",
null,
"tilt direction uncertainty") {
@Override
// not a SPOC product, just the SIGMA column in the ancillary fits file
public int getGlobalSpocID() {
return -1;
}
@Override
// not a SPOC product, just the SIGMA column in the ancillary fits file
public int getLocalSpocID() {
return -1;
}
// no such plane in DTM
@Override
public PlaneInfo getPlaneInfo() {
return null;
}
@Override
public AltwgDataType getSigmaType() {
return NA;
}
}, },
/* /*
@@ -1590,10 +1593,8 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() { public AltwgDataType getSigmaType() {
return NA; return NA;
} }
}; };
private String description; // description of enumeration private String description; // description of enumeration
private String fileFrag; private String fileFrag;
private String units; private String units;
@@ -1601,8 +1602,8 @@ public enum AltwgDataType {
private String comment; // comment to add to fits header (optional) private String comment; // comment to add to fits header (optional)
private String anciColName; private String anciColName;
AltwgDataType(String description, String fileFrag, String units, String headerValue, AltwgDataType(
String comment, String anciColName) { String description, String fileFrag, String units, String headerValue, String comment, String anciColName) {
this.description = description; this.description = description;
this.fileFrag = fileFrag; this.fileFrag = fileFrag;
this.units = units; this.units = units;
@@ -1636,10 +1637,8 @@ public enum AltwgDataType {
} }
public String getHeaderValueWithUnits() { public String getHeaderValueWithUnits() {
if (units == null) if (units == null) return headerValue;
return headerValue; else return headerValue + " (" + units + ")";
else
return headerValue + " (" + units + ")";
} }
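Based on the constants defined earlier in this enum, the method above behaves as follows:

    AltwgDataType.ELEVATION.getHeaderValueWithUnits(); // "Elevation (meters)"
    AltwgDataType.FACET_MAP.getHeaderValueWithUnits(); // "Facet Coarse Index" (units is null)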
public String desc() { public String desc() {
@@ -1665,9 +1664,11 @@ public enum AltwgDataType {
* AltwgProductTypes. * AltwgProductTypes.
* *
*/ */
public static final EnumSet<AltwgDataType> anciGravitySet = public static final EnumSet<AltwgDataType> anciGravitySet = EnumSet.of(
EnumSet.of(AltwgDataType.GRAVITY_VECTOR_X, AltwgDataType.GRAVITATIONAL_MAGNITUDE, AltwgDataType.GRAVITY_VECTOR_X,
AltwgDataType.GRAVITATIONAL_POTENTIAL, AltwgDataType.ELEVATION); AltwgDataType.GRAVITATIONAL_MAGNITUDE,
AltwgDataType.GRAVITATIONAL_POTENTIAL,
AltwgDataType.ELEVATION);
// slope is no longer in anciGravitySet, because it depends on errors that are determined // slope is no longer in anciGravitySet, because it depends on errors that are determined
// in Shape2Tilt. See MultiTableToAncillaryFits.java // in Shape2Tilt. See MultiTableToAncillaryFits.java
@@ -1695,17 +1696,22 @@ public enum AltwgDataType {
// EnumSet.range(AltwgDataType.NORMAL_VECTOR_X, // EnumSet.range(AltwgDataType.NORMAL_VECTOR_X,
// AltwgDataType.SLOPE); // AltwgDataType.SLOPE);
public static final EnumSet<AltwgDataType> gravityPlanes = public static final EnumSet<AltwgDataType> gravityPlanes = EnumSet.of(
EnumSet.of(AltwgDataType.NORMAL_VECTOR_X, AltwgDataType.NORMAL_VECTOR_Y, AltwgDataType.NORMAL_VECTOR_X,
AltwgDataType.NORMAL_VECTOR_Z, AltwgDataType.GRAVITY_VECTOR_X, AltwgDataType.NORMAL_VECTOR_Y,
AltwgDataType.GRAVITY_VECTOR_Y, AltwgDataType.GRAVITY_VECTOR_Z, AltwgDataType.NORMAL_VECTOR_Z,
AltwgDataType.GRAVITATIONAL_MAGNITUDE, AltwgDataType.GRAVITATIONAL_POTENTIAL, AltwgDataType.GRAVITY_VECTOR_X,
AltwgDataType.ELEVATION, AltwgDataType.SLOPE, AltwgDataType.AREA); AltwgDataType.GRAVITY_VECTOR_Y,
AltwgDataType.GRAVITY_VECTOR_Z,
AltwgDataType.GRAVITATIONAL_MAGNITUDE,
AltwgDataType.GRAVITATIONAL_POTENTIAL,
AltwgDataType.ELEVATION,
AltwgDataType.SLOPE,
AltwgDataType.AREA);
public static final EnumSet<AltwgDataType> tiltPlanes = public static final EnumSet<AltwgDataType> tiltPlanes =
EnumSet.range(AltwgDataType.FACET_TILT, AltwgDataType.RELATIVE_HEIGHT); EnumSet.range(AltwgDataType.FACET_TILT, AltwgDataType.RELATIVE_HEIGHT);
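A quick membership check against the sets defined above (SLOPE appears in gravityPlanes but, per the comment above, is no longer part of anciGravitySet):

    AltwgDataType.gravityPlanes.contains(AltwgDataType.SLOPE);      // true
    AltwgDataType.anciGravitySet.contains(AltwgDataType.SLOPE);     // false
    AltwgDataType.anciGravitySet.contains(AltwgDataType.ELEVATION); // true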
/** /**
* Unique identifier for ALTWG global product in OSIRIS-REx SPOC system. * Unique identifier for ALTWG global product in OSIRIS-REx SPOC system.
* *

View File

@@ -25,9 +25,16 @@ package terrasaur.enums;
import org.apache.commons.io.FilenameUtils; import org.apache.commons.io.FilenameUtils;
public enum FORMATS { public enum FORMATS {
ASCII(true),
ASCII(true), BIN3(true), BIN4(true), BIN7(true), FITS(false), ICQ(false), OBJ(false), PLT( BIN3(true),
false), PLY(false), VTK(false); BIN4(true),
BIN7(true),
FITS(false),
ICQ(false),
OBJ(false),
PLT(false),
PLY(false),
VTK(false);
/** True if this format contains no facet information */ /** True if this format contains no facet information */
public boolean pointsOnly; public boolean pointsOnly;
@@ -78,5 +85,4 @@ public enum FORMATS {
return null; return null;
} }
} }
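FilenameUtils (imported above) suggests how a caller might map a file name onto one of these constants. An illustrative sketch only; this is not necessarily the lookup method the enum actually provides:

    static FORMATS guessFormat(String filename) {
      String ext = FilenameUtils.getExtension(filename).toUpperCase();
      for (FORMATS f : FORMATS.values()) {
        if (f.name().equals(ext)) return f; // e.g. "model.obj" -> OBJ
      }
      return null; // unknown extension
    }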

View File

@@ -30,15 +30,22 @@ package terrasaur.enums;
* *
*/ */
public enum FitsHeaderType { public enum FitsHeaderType {
ANCIGLOBALGENERIC,
ANCIGLOBALGENERIC, ANCILOCALGENERIC, ANCIGLOBALALTWG, ANCIG_FACETRELATION_ALTWG, ANCILOCALALTWG, DTMGLOBALALTWG, DTMLOCALALTWG, DTMGLOBALGENERIC, DTMLOCALGENERIC, NFTMLN; ANCILOCALGENERIC,
ANCIGLOBALALTWG,
ANCIG_FACETRELATION_ALTWG,
ANCILOCALALTWG,
DTMGLOBALALTWG,
DTMLOCALALTWG,
DTMGLOBALGENERIC,
DTMLOCALGENERIC,
NFTMLN;
public static boolean isGLobal(FitsHeaderType hdrType) { public static boolean isGLobal(FitsHeaderType hdrType) {
boolean globalProduct; boolean globalProduct;
switch (hdrType) { switch (hdrType) {
case ANCIGLOBALALTWG: case ANCIGLOBALALTWG:
case ANCIGLOBALGENERIC: case ANCIGLOBALGENERIC:
case DTMGLOBALALTWG: case DTMGLOBALALTWG:

View File

@@ -148,7 +148,6 @@ public enum PlaneInfo {
coreTilts.add(PlaneInfo.RELATIVE_HEIGHT); coreTilts.add(PlaneInfo.RELATIVE_HEIGHT);
return coreTilts; return coreTilts;
} }
/** /**
@@ -161,8 +160,7 @@ public enum PlaneInfo {
* @return * @return
* @throws HeaderCardException * @throws HeaderCardException
*/ */
public static List<HeaderCard> planesToHeaderCard(List<PlaneInfo> planeList) public static List<HeaderCard> planesToHeaderCard(List<PlaneInfo> planeList) throws HeaderCardException {
throws HeaderCardException {
List<HeaderCard> planeHeaders = new ArrayList<HeaderCard>(); List<HeaderCard> planeHeaders = new ArrayList<HeaderCard>();
String plane = "PLANE"; String plane = "PLANE";
int cc = 1; int cc = 1;
@@ -172,7 +170,5 @@ public enum PlaneInfo {
cc++; cc++;
} }
return planeHeaders; return planeHeaders;
} }
} }
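A sketch of the loop body elided by the hunk above: each plane gets a sequentially numbered PLANEn card. The three-argument HeaderCard constructor from nom-tam-fits and the use of the enum name as the card value are assumptions here, not confirmed by the diff:

    List<HeaderCard> planeHeaders = new ArrayList<>();
    String plane = "PLANE";
    int cc = 1;
    for (PlaneInfo thisPlane : planeList) {
      planeHeaders.add(new HeaderCard(plane + cc, thisPlane.name(), null)); // PLANE1, PLANE2, ...
      cc++;
    }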

View File

@@ -32,7 +32,6 @@ import com.google.common.base.Strings;
* *
*/ */
public enum SigmaFileType { public enum SigmaFileType {
SPCSIGMA { SPCSIGMA {
@Override @Override
@@ -118,7 +117,6 @@ public enum SigmaFileType {
public static SigmaFileType sigmaTypeFromSrcType(SrcProductType srcType) { public static SigmaFileType sigmaTypeFromSrcType(SrcProductType srcType) {
SigmaFileType sigmaType = SigmaFileType.NOMATCH; SigmaFileType sigmaType = SigmaFileType.NOMATCH;
switch (srcType) { switch (srcType) {
case SPC: case SPC:
sigmaType = SigmaFileType.SPCSIGMA; sigmaType = SigmaFileType.SPCSIGMA;
break; break;
@@ -130,7 +128,6 @@ public enum SigmaFileType {
default: default:
sigmaType = SigmaFileType.NOMATCH; sigmaType = SigmaFileType.NOMATCH;
break; break;
} }
return sigmaType; return sigmaType;

View File

@@ -30,14 +30,12 @@ package terrasaur.enums;
* *
*/ */
public enum SrcProductType { public enum SrcProductType {
SFM { SFM {
@Override @Override
public String getAltwgFrag() { public String getAltwgFrag() {
return "sfm"; return "sfm";
} }
}, },
SPC { SPC {
@@ -68,7 +66,6 @@ public enum SrcProductType {
public String getAltwgFrag() { public String getAltwgFrag() {
return "tru"; return "tru";
} }
}, },
UNKNOWN { UNKNOWN {
@@ -106,10 +103,8 @@ public enum SrcProductType {
public static SrcProductType fromAltwgFrag(String stringFrag) { public static SrcProductType fromAltwgFrag(String stringFrag) {
for (SrcProductType prodType : SrcProductType.values()) { for (SrcProductType prodType : SrcProductType.values()) {
if (prodType.getAltwgFrag().equals(stringFrag)) if (prodType.getAltwgFrag().equals(stringFrag)) return prodType;
return prodType;
} }
return UNKNOWN; return UNKNOWN;
} }
} }
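Usage example for the lookup above; the "sfm" fragment is defined earlier in this enum, and anything unrecognized falls through to UNKNOWN:

    SrcProductType.fromAltwgFrag("sfm");   // SrcProductType.SFM
    SrcProductType.fromAltwgFrag("bogus"); // SrcProductType.UNKNOWN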

View File

@@ -35,7 +35,16 @@ public enum AltPipelnEnum {
  // controls whether or not certain global products get created.
  // REQUIRED TO BE IN CONFIGFILE:
-  DOGLOBALSHAPE, OBJTOFITS, ADDOBJCOMMENTS, GLOBALRES0, DUMBERVALUES, DOGRAVGLOBAL, DOGLOBALGRAV_ANCI, DOGLOBALTILT_ANCI, DOGLOBALMISC_ANCI, DOGLOBALTILT,
+  DOGLOBALSHAPE,
+  OBJTOFITS,
+  ADDOBJCOMMENTS,
+  GLOBALRES0,
+  DUMBERVALUES,
+  DOGRAVGLOBAL,
+  DOGLOBALGRAV_ANCI,
+  DOGLOBALTILT_ANCI,
+  DOGLOBALMISC_ANCI,
+  DOGLOBALTILT,
  // controls number of slots per job to use when running global distributed gravity
  // in grid engine mode. Does not apply if running in local parallel mode
@@ -49,17 +58,26 @@
  DSKERNEL,
  // enable/disable the creation of global and local DSK files
-  DOGLOBALDSK, DOLOCALDSK,
+  DOGLOBALDSK,
+  DOLOCALDSK,
  // global tilt semi-major axis in km. Now a required variable
  GTILTMAJAXIS,
  // settings for every local product generated by the pipeline
-  USEOLDMAPLETS, DODTMFITSOBJ, DOGRAVLOCAL, GENLOCALGRAV, DOLOCALTILT, DOLOCAL_GRAVANCI, DOLOCAL_TILTANCI, DOLOCAL_MISCANCI,
+  USEOLDMAPLETS,
+  DODTMFITSOBJ,
+  DOGRAVLOCAL,
+  GENLOCALGRAV,
+  DOLOCALTILT,
+  DOLOCAL_GRAVANCI,
+  DOLOCAL_TILTANCI,
+  DOLOCAL_MISCANCI,
  // controls whether to use average gravitational potential for global and local gravity
  // calculations. =0 use minimum reference potential, =1 use average reference potential
-  AVGGRAVPOTGLOBAL, AVGGRAVPOTLOCAL,
+  AVGGRAVPOTGLOBAL,
+  AVGGRAVPOTLOCAL,
  // controls RunBigMap.
  // integrate slope to height required. Defaults to "n" if this enum does not exist in config file.
@@ -73,18 +91,27 @@
  // controls the source data, product destination, naming convention,
  // and process flow.
-  DATASOURCE, REPORTTYPE, INDATADIR, OUTPUTDIR, PDSNAMING, RENAMEGLOBALOBJ, USEBIGMAPS, DOREPORT,
+  DATASOURCE,
+  REPORTTYPE,
+  INDATADIR,
+  OUTPUTDIR,
+  PDSNAMING,
+  RENAMEGLOBALOBJ,
+  USEBIGMAPS,
+  DOREPORT,
  // shape model density and rotation rate are now required variables. This way we can easily spot
  // what we are using
  // as defaults.
-  SMDENSITY, SMRRATE,
+  SMDENSITY,
+  SMRRATE,
  // stores type of naming convention. Ex. AltProduct, AltNFTMLN, DartProduct.
  NAMINGCONVENTION,
  // set values that cannot be derived from data.
-  REPORTBASENAME, VERSION,
+  REPORTBASENAME,
+  VERSION,
  // everything after this is not a required keyword
@@ -109,7 +136,8 @@
  // (optional) controls whether or not STL files get generated. If these do not exist in the
  // pipeConfig file then they will NOT
  // get generated!
-  GLOBALSTL, LOCALSTL,
+  GLOBALSTL,
+  LOCALSTL,
  // keywords for local products
  //
@@ -120,11 +148,26 @@
  // MAPFILE: contains pointer to map centers file (optional). Used by TileShapeModelWithBigmaps.
  // defaults to auto-generated tiles if this is not specified.
  // allow for pointers to different files for 30cm and 10cm map products.
-  DOLOCALMAP, MAPDIR, MAPSCALE, MAPHALFSZ, REPORT, MAPSmallerSZ, MAPFILE, ISTAGSITE, LTILTMAJAXIS, LTILTMINAXIS, LTILTPA, MAXSCALE,
+  DOLOCALMAP,
+  MAPDIR,
+  MAPSCALE,
+  MAPHALFSZ,
+  REPORT,
+  MAPSmallerSZ,
+  MAPFILE,
+  ISTAGSITE,
+  LTILTMAJAXIS,
+  LTILTMINAXIS,
+  LTILTPA,
+  MAXSCALE,
  // settings for local tag sites. Note TAGSFILE is not optional.
  // it contains the tagsite name and lat,lon of tag site tile center
-  TAGDIR, TAGSCALE, TAGHALFSZ, TAGSFILE, REPORTTAG,
+  TAGDIR,
+  TAGSCALE,
+  TAGHALFSZ,
+  TAGSFILE,
+  REPORTTAG,
  // pointer to OLA database. only required if DATASOURCE is OLA
  OLADBPATH,
@@ -152,12 +195,18 @@
   * parameters. The pipeline will use default values for these enums if they are not defined in the
   * pipeline configuration file.
   */
-  GALGORITHM, GRAVCONST, GTILTMINAXIS, GTILTPA, MASSUNCERT, VSEARCHRAD_PCTGSD, FSEARCHRAD_PCTGSD, PXPERDEG,
+  GALGORITHM,
+  GRAVCONST,
+  GTILTMINAXIS,
+  GTILTPA,
+  MASSUNCERT,
+  VSEARCHRAD_PCTGSD,
+  FSEARCHRAD_PCTGSD,
+  PXPERDEG,
  // The following are options to subvert normal pipeline operations or to configure pipeline for
  // other missions
  // global objs are supplied at all resolutions as the starting point.
  // This means we can skip ICQ2PLT, ICQDUMBER, and PLT2OBJ calls
  OBJASINPUT,
@@ -191,7 +240,8 @@
  OBJCOMHEADER,
  // identifies whether there are poisson reconstruct data products to include in the webpage report
-  LOCALPOISSON, GLOBALPOISSON;
+  LOCALPOISSON,
+  GLOBALPOISSON;
  /*
   * The following enumerations are required to exist in the altwg pipeline config file.
@@ -215,8 +265,7 @@
  public static AltPipelnEnum fromString(String textString) {
    for (AltPipelnEnum anciType : AltPipelnEnum.values()) {
-      if (anciType.toString().equals(textString))
-        return anciType;
+      if (anciType.toString().equals(textString)) return anciType;
    }
    return null;
  }
@@ -246,7 +295,6 @@
      if (parsedVal > 0) {
        returnFlag = true;
      }
    }
    return returnFlag;
  }
@@ -314,11 +362,9 @@
      default:
        return "NA";
    }
  }
  return "NA";
-  };
+  }
+  ;
}

View File

@@ -56,7 +56,6 @@ public class AltwgAnciGlobal extends AnciTableFits implements AnciFitsHeader {
    headers.addAll(getSpecificInfo("product specific"));
    return headers;
  }
  /**
@@ -75,8 +74,5 @@ public class AltwgAnciGlobal extends AnciTableFits implements AnciFitsHeader {
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
    return headers;
  }
}

View File

@@ -57,7 +57,6 @@ public class AltwgAnciGlobalFacetRelation extends AnciTableFits implements AnciF
    headers.addAll(getSpecificInfo("product specific"));
    return headers;
  }
  /**

View File

@@ -56,7 +56,6 @@ public class AltwgAnciLocal extends AnciTableFits implements AnciFitsHeader {
    headers.addAll(getSpecificInfo("product specific"));
    return headers;
  }
  /**
@@ -75,8 +74,5 @@ public class AltwgAnciLocal extends AnciTableFits implements AnciFitsHeader {
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
    return headers;
  }
}

View File

@@ -63,7 +63,6 @@ public class AltwgGlobalDTM extends DTMFits implements DTMHeader {
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
    return headers;
  }
  /**
@@ -83,10 +82,8 @@ public class AltwgGlobalDTM extends DTMFits implements DTMHeader {
    headers.add(fitsHdr.getHeaderCard(HeaderTag.HDRVERS));
    return headers;
  }
  /**
   * Added GSDI - specific to OREX-SPOC.
   */
@@ -105,5 +102,4 @@ public class AltwgGlobalDTM extends DTMFits implements DTMHeader {
    return headers;
  }
}

View File

@@ -63,7 +63,6 @@ public class AltwgLocalDTM extends DTMFits implements DTMHeader {
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
    return headers;
  }
  @Override
@@ -129,7 +128,6 @@ public class AltwgLocalDTM extends DTMFits implements DTMHeader {
    headers.add(fitsHdr.getHeaderCard(HeaderTag.HDRVERS));
    return headers;
  }
  /**
@@ -150,5 +148,4 @@ public class AltwgLocalDTM extends DTMFits implements DTMHeader {
    return headers;
  }
}

View File

@@ -23,12 +23,10 @@
package terrasaur.fits;
import java.util.List;
import nom.tam.fits.HeaderCard;
import nom.tam.fits.HeaderCardException;
public interface AnciFitsHeader {
  public List<HeaderCard> createFitsHeader() throws HeaderCardException;
}

View File

@@ -60,7 +60,6 @@ public abstract class AnciTableFits {
    headers.addAll(getSpatialInfo("summary spatial information"));
    return headers;
  }
  /**
@@ -78,7 +77,6 @@ public abstract class AnciTableFits {
    headers.add(fitsHdr.getHeaderCard(HeaderTag.HDRVERS));
    return headers;
  }
  /**
@@ -99,7 +97,6 @@ public abstract class AnciTableFits {
    headers.add(fitsHdr.getHeaderCard(HeaderTag.ORIGIN));
    return headers;
  }
  /**
@@ -117,7 +114,6 @@ public abstract class AnciTableFits {
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
    return headers;
  }
  /**
@@ -141,7 +137,6 @@ public abstract class AnciTableFits {
    headers.add(fitsHdr.getHeaderCard(HeaderTag.CREATOR));
    headers.add(fitsHdr.getHeaderCard(HeaderTag.OBJ_FILE));
    return headers;
  }
  public List<HeaderCard> getMapInfo(String comment) throws HeaderCardException {
@@ -179,7 +174,6 @@ public abstract class AnciTableFits {
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFT_VER));
    return headers;
  }
  public List<HeaderCard> getSpatialInfo(String comment) throws HeaderCardException {
@@ -285,7 +279,6 @@ public abstract class AnciTableFits {
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UX_Z));
    return headers;
  }
  public List<HeaderCard> getUY() throws HeaderCardException {
@@ -296,7 +289,6 @@ public abstract class AnciTableFits {
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UY_Z));
    return headers;
  }
  public List<HeaderCard> getUZ() throws HeaderCardException {
@@ -307,7 +299,5 @@ public abstract class AnciTableFits {
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UZ_Z));
    return headers;
  }
}

View File

@@ -97,7 +97,6 @@ public abstract class DTMFits {
    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    return headers;
  }
  /**
@@ -118,7 +117,6 @@ public abstract class DTMFits {
    headers.add(fitsHdr.getHeaderCard(HeaderTag.ORIGIN));
    return headers;
  }
  /**
@@ -136,7 +134,6 @@ public abstract class DTMFits {
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
    return headers;
  }
  /**
@@ -160,7 +157,6 @@ public abstract class DTMFits {
    headers.add(fitsHdr.getHeaderCard(HeaderTag.OBJ_FILE));
    return headers;
  }
  public List<HeaderCard> getMapInfo(String comment) throws HeaderCardException {
@@ -197,7 +193,6 @@ public abstract class DTMFits {
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFT_VER));
    return headers;
  }
  /**
@@ -238,8 +233,7 @@ public abstract class DTMFits {
   * @return
   * @throws HeaderCardException
   */
-  public List<HeaderCard> getPlaneInfo(String comment, List<HeaderCard> planeList)
-      throws HeaderCardException {
+  public List<HeaderCard> getPlaneInfo(String comment, List<HeaderCard> planeList) throws HeaderCardException {
    List<HeaderCard> headers = new ArrayList<HeaderCard>();
    if (comment.length() > 0) {
@@ -248,8 +242,8 @@ public abstract class DTMFits {
    headers.addAll(planeList);
    if (!dataContained) {
-      String errMesg = "ERROR! Cannot return keywords describing the DTM cube without "
-          + "having the actual data!";
+      String errMesg =
+          "ERROR! Cannot return keywords describing the DTM cube without " + "having the actual data!";
      throw new RuntimeException(errMesg);
    }
@@ -260,8 +254,8 @@ public abstract class DTMFits {
      for (HeaderCard thisPlane : planeList) {
        System.out.println(thisPlane.getKey() + ":" + thisPlane.getValue());
      }
-      String errMesg = "Error: plane List has " + planeList.size() + " planes but datacube "
-          + " has " + fitsData.getData().length + " planes";
+      String errMesg = "Error: plane List has " + planeList.size() + " planes but datacube " + " has "
+          + fitsData.getData().length + " planes";
      throw new RuntimeException(errMesg);
    }
@@ -341,7 +335,6 @@ public abstract class DTMFits {
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UX_Z));
    return headers;
  }
  public List<HeaderCard> getUY() throws HeaderCardException {
@@ -352,7 +345,6 @@ public abstract class DTMFits {
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UY_Z));
    return headers;
  }
  public List<HeaderCard> getUZ() throws HeaderCardException {
@@ -363,11 +355,9 @@ public abstract class DTMFits {
    headers.add(fitsHdr.getHeaderCardD(HeaderTag.UZ_Z));
    return headers;
  }
  public HeaderCard getEnd() throws HeaderCardException {
    return new HeaderCard(HeaderTag.END.toString(), HeaderTag.END.value(), HeaderTag.END.comment());
  }
}

View File

@@ -124,7 +124,6 @@ public class FitsData {
    }
  }
  public static class FitsDataBuilder {
    private final double[][][] data;
    private double[] V = null;
@@ -189,7 +188,6 @@ public class FitsData {
      default:
        throw new RuntimeException();
    }
    return this;
  }

View File

@@ -33,12 +33,12 @@ import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.TimeZone;
-import org.apache.commons.io.FileUtils;
import nom.tam.fits.FitsException;
import nom.tam.fits.Header;
import nom.tam.fits.HeaderCard;
import nom.tam.fits.HeaderCardException;
import nom.tam.util.Cursor;
+import org.apache.commons.io.FileUtils;
import picante.math.coords.CoordConverters;
import picante.math.coords.LatitudinalVector;
import picante.math.vectorspace.UnwritableVectorIJK;
@@ -83,9 +83,7 @@ public class FitsHdr {
     * basic no-arg constructor. Note that this constructor does NOT populate the fitsKV map! If you
     * want an initial builder with initialized map then call initHdrBuilder().
     */
-    public FitsHdrBuilder() {
-    }
+    public FitsHdrBuilder() {}
    /**
     * Return all HeaderCards in their current state
@@ -164,7 +162,6 @@ public class FitsHdr {
        setVCbyHeaderTag(HeaderTag.LRCLAT, "N/A", HeaderTag.LRCLAT.comment());
        setVCbyHeaderTag(HeaderTag.ULCLNG, "N/A", HeaderTag.ULCLNG.comment());
        setVCbyHeaderTag(HeaderTag.ULCLAT, "N/A", HeaderTag.ULCLAT.comment());
      }
      return this;
@@ -179,10 +176,8 @@ public class FitsHdr {
     */
    public FitsHdrBuilder setCenterLatLon(LatitudinalVector llr) throws HeaderCardException {
      if (llr != null) {
-        setVCbyHeaderTag(HeaderTag.CLON, Math.toDegrees(llr.getLongitude()),
-            HeaderTag.CLON.comment());
-        setVCbyHeaderTag(HeaderTag.CLAT, Math.toDegrees(llr.getLatitude()),
-            HeaderTag.CLAT.comment());
+        setVCbyHeaderTag(HeaderTag.CLON, Math.toDegrees(llr.getLongitude()), HeaderTag.CLON.comment());
+        setVCbyHeaderTag(HeaderTag.CLAT, Math.toDegrees(llr.getLatitude()), HeaderTag.CLAT.comment());
      }
      return this;
    }
@@ -247,7 +242,6 @@ public class FitsHdr {
        String errMesg = "ERROR! Center Vector is null! It could be that center"
            + " vector information is already contained in FitsHdrBuilder!";
        throw new RuntimeException(errMesg);
      }
      return this;
    }
@@ -289,7 +283,6 @@ public class FitsHdr {
        String errMesg = "ERROR! UX Vector is null! It could be that UX"
            + " vector information is already contained in FitsHdrBuilder!";
        throw new RuntimeException(errMesg);
      }
      return this;
@@ -313,11 +306,9 @@ public class FitsHdr {
        String errMesg = "ERROR! UY Vector is null! It could be that UY"
            + " vector information is already contained in FitsHdrBuilder!";
        throw new RuntimeException(errMesg);
      }
      return this;
    }
    /**
@@ -338,11 +329,9 @@ public class FitsHdr {
        String errMesg = "ERROR! UZ Vector is null! It could be that UZ"
            + " vector information is already contained in FitsHdrBuilder!";
        throw new RuntimeException(errMesg);
      }
      return this;
    }
    /**
@@ -382,8 +371,7 @@ public class FitsHdr {
     * @return
     * @throws HeaderCardException
     */
-    public FitsHdrBuilder setVbyHeaderTag(HeaderTag hdrTag, String value)
-        throws HeaderCardException {
+    public FitsHdrBuilder setVbyHeaderTag(HeaderTag hdrTag, String value) throws HeaderCardException {
      if (fitsKV.containsKey(hdrTag.toString())) {
        fitsKV.get(hdrTag.toString()).setValue(value);
@@ -409,8 +397,7 @@ public class FitsHdr {
     * @return
     * @throws HeaderCardException
     */
-    public FitsHdrBuilder setVbyHeaderTag(HeaderTag hdrTag, double value)
-        throws HeaderCardException {
+    public FitsHdrBuilder setVbyHeaderTag(HeaderTag hdrTag, double value) throws HeaderCardException {
      String hdrKey = hdrTag.toString();
      if (fitsKV.containsKey(hdrKey)) {
@@ -453,10 +440,8 @@ public class FitsHdr {
        }
        HeaderCard newCard = new HeaderCard(hdrKey, value, comment);
        fitsKV.put(hdrKey, newCard);
      }
      return this;
    }
    /**
@@ -477,7 +462,6 @@ public class FitsHdr {
        throw new RuntimeException(errMesg);
      }
      return this;
    }
    /**
@@ -507,7 +491,6 @@ public class FitsHdr {
        fitsKV.put(hdrKey, newCard);
      }
      return this;
    }
    /**
@@ -525,7 +508,6 @@ public class FitsHdr {
          System.out.println(
              "WARNING! header:" + hdrKey + " is new. Will add it to " + "fits header builder.");
        }
      }
      fitsKV.put(hdrKey, headerCard);
@@ -582,9 +564,9 @@ public class FitsHdr {
      if (fitsData.hasUnitv()) {
        // assume all unit vectors are present
-        this.setUx(fitsData.getUnit(UnitDir.UX)).setUy(fitsData.getUnit(UnitDir.UY))
+        this.setUx(fitsData.getUnit(UnitDir.UX))
+            .setUy(fitsData.getUnit(UnitDir.UY))
            .setUz(fitsData.getUnit(UnitDir.UZ));
      }
      if (fitsData.hasGsd()) {
@@ -638,15 +620,13 @@ public class FitsHdr {
    public void addCard(HeaderTag headerTag) throws HeaderCardException {
      if (this.containsKey(headerTag)) {
-        System.out.println("WARNING: keyword:" + headerTag.toString()
-            + " already exists in map. Will not" + " add a default headercard!");
+        System.out.println("WARNING: keyword:" + headerTag.toString() + " already exists in map. Will not"
+            + " add a default headercard!");
      } else {
-        HeaderCard newCard =
-            new HeaderCard(headerTag.toString(), headerTag.value(), headerTag.comment());
+        HeaderCard newCard = new HeaderCard(headerTag.toString(), headerTag.value(), headerTag.comment());
        fitsKV.put(headerTag.toString(), newCard);
      }
    }
    public FitsHdr build() {
@@ -661,8 +641,7 @@ public class FitsHdr {
   * @return
   * @throws HeaderCardException
   */
-  public static FitsHdrBuilder checkGlobalLatLon(FitsHdrBuilder thisBuilder)
-      throws HeaderCardException {
+  public static FitsHdrBuilder checkGlobalLatLon(FitsHdrBuilder thisBuilder) throws HeaderCardException {
    // check if this keyword exists first. If not, then add all the corner keywords!
    if (!thisBuilder.containsKey(HeaderTag.LLCLNG)) {
@@ -687,17 +666,14 @@ public class FitsHdr {
      thisBuilder.setVbyHeaderTag(HeaderTag.URCLAT, 90D);
      thisBuilder.setVbyHeaderTag(HeaderTag.LRCLNG, 180D);
      thisBuilder.setVbyHeaderTag(HeaderTag.LRCLAT, -90D);
    }
    // check if this keyword exists first. If not then add the center lat,lon keywords!
    if (!thisBuilder.containsKey(HeaderTag.CLAT)) {
      thisBuilder.addCard(HeaderTag.CLAT);
      thisBuilder.addCard(HeaderTag.CLON);
    }
-    if ((!validLatLon(thisBuilder, HeaderTag.CLAT))
-        || (!validLatLon(thisBuilder, HeaderTag.CLON))) {
+    if ((!validLatLon(thisBuilder, HeaderTag.CLAT)) || (!validLatLon(thisBuilder, HeaderTag.CLON))) {
      System.out.println("WARNING! check of global center keywords shows they"
          + "have not been set! Setting to default global values (0,0)");
      LatitudinalVector llr = new LatitudinalVector(1., 0., 0.);
@@ -739,7 +715,6 @@ public class FitsHdr {
      corner = HeaderTag.CLON;
      throwBadLatLon(thisBuilder, corner);
    }
    private static boolean validLatLon(FitsHdrBuilder thisBuilder, HeaderTag cornerTag) {
@@ -755,13 +730,10 @@ public class FitsHdr {
      if (!validLatLon(thisBuilder, cornerTag)) {
        String chkVal = thisBuilder.getCard(cornerTag).getValue();
-        String errMesg =
-            ("ERROR!" + cornerTag.toString() + " in FitsHdrBuilder not valid:" + chkVal);
+        String errMesg = ("ERROR!" + cornerTag.toString() + " in FitsHdrBuilder not valid:" + chkVal);
        throw new RuntimeException(errMesg);
      }
    }
  } // end static class FitsHdrBuilder
  /**
@@ -787,7 +759,6 @@ public class FitsHdr {
          + fitsFile.toString() + " for fits header!";
      System.err.println(errmesg);
      System.exit(1);
    }
    return hdrBuilder;
  }
@@ -803,8 +774,7 @@ public class FitsHdr {
   * @return
   * @throws HeaderCardException
   */
-  public static FitsHdrBuilder copyFitsHeader(File fitsFile, boolean initHdr)
-      throws HeaderCardException {
+  public static FitsHdrBuilder copyFitsHeader(File fitsFile, boolean initHdr) throws HeaderCardException {
    FitsHdrBuilder hdrBuilder = new FitsHdrBuilder();
    try {
@@ -817,12 +787,10 @@ public class FitsHdr {
          + fitsFile.toString() + " for fits header!";
      System.err.println(errmesg);
      System.exit(1);
    }
    return hdrBuilder;
  }
  /**
   * Loops through map of fits header cards and use it to populate and return FitsHdrBuilder. There
   * are NO checks to see if the keywords match those in the toolkit!
@@ -857,8 +825,7 @@ public class FitsHdr {
   * @return
   * @throws HeaderCardException
   */
-  public static FitsHdrBuilder copyFitsHeader(Map<String, HeaderCard> map)
-      throws HeaderCardException {
+  public static FitsHdrBuilder copyFitsHeader(Map<String, HeaderCard> map) throws HeaderCardException {
    // initialize hdrBuilder with the full set of keywords specified in HeaderTag
    boolean initHdr = false;
@@ -876,8 +843,8 @@ public class FitsHdr {
   * @return
   * @throws HeaderCardException
   */
-  public static FitsHdrBuilder addToFitsHeader(Map<String, HeaderCard> map,
-      FitsHdrBuilder hdrBuilder) throws HeaderCardException {
+  public static FitsHdrBuilder addToFitsHeader(Map<String, HeaderCard> map, FitsHdrBuilder hdrBuilder)
+      throws HeaderCardException {
    if (hdrBuilder == null) {
      hdrBuilder = new FitsHdrBuilder();
@@ -888,7 +855,6 @@ public class FitsHdr {
      hdrBuilder.setbyHeaderCard(thisCard);
    }
    return hdrBuilder;
  }
  /**
@@ -899,8 +865,7 @@ public class FitsHdr {
   * @return
   * @throws HeaderCardException
   */
-  public static FitsHdrBuilder addToFitsHeader(File fitsFile, FitsHdrBuilder hdrBuilder)
-      throws HeaderCardException {
+  public static FitsHdrBuilder addToFitsHeader(File fitsFile, FitsHdrBuilder hdrBuilder) throws HeaderCardException {
    if (hdrBuilder == null) {
      hdrBuilder = new FitsHdrBuilder();
@@ -917,7 +882,6 @@ public class FitsHdr {
          + fitsFile.toString() + " for fits header!";
      System.err.println(errmesg);
      System.exit(1);
    }
    return hdrBuilder;
  }
@@ -931,8 +895,7 @@ public class FitsHdr {
  public static FitsHdrBuilder initHdrBuilder() throws HeaderCardException {
    FitsHdrBuilder builder = new FitsHdrBuilder();
    for (HeaderTag tag : HeaderTag.fitsKeywords) {
-      builder.fitsKV.put(tag.toString(),
-          new HeaderCard(tag.toString(), tag.value(), tag.comment()));
+      builder.fitsKV.put(tag.toString(), new HeaderCard(tag.toString(), tag.value(), tag.comment()));
    }
    return builder;
  }
@@ -957,8 +920,8 @@ public class FitsHdr {
    }
    if (hdrBuilder == null) {
-      System.out.println("builder passed to FitsHeader.configHdrBuilder() is null. Generating"
-          + " new FitsHeaderBuilder");
+      System.out.println(
+          "builder passed to FitsHeader.configHdrBuilder() is null. Generating" + " new FitsHeaderBuilder");
      hdrBuilder = new FitsHdrBuilder();
    }
    List<String> content = FileUtils.readLines(new File(configFile), Charset.defaultCharset());
@@ -987,8 +950,8 @@ public class FitsHdr {
          // user explicitly wants to override any comment in this header with null
          hdrBuilder.setVCbyHeaderTag(thisTag, keyval[1], null);
        } else {
-          System.out.println("setting " + thisTag.toString() + " to " + keyval[1]
-              + ", comment to " + keyval[2]);
+          System.out.println(
+              "setting " + thisTag.toString() + " to " + keyval[1] + ", comment to " + keyval[2]);
          hdrBuilder.setVCbyHeaderTag(thisTag, keyval[1], keyval[2]);
        }
      } else {
@@ -997,9 +960,7 @@ public class FitsHdr {
        System.out.println(line);
        System.out.println("Cannot parse. skipping this line");
      }
    }
    }
    }
    if (!separatorFound) {
@@ -1054,8 +1015,8 @@ public class FitsHdr {
      return fitsKV.get(keyword);
    } else {
-      System.out.println("WARNING! keyword:" + keyword + " not set in fits header builder."
-          + " using default value.");
+      System.out.println(
+          "WARNING! keyword:" + keyword + " not set in fits header builder." + " using default value.");
      return new HeaderCard(tag.toString(), tag.value(), tag.comment());
    }
  }
@@ -1075,18 +1036,16 @@ public class FitsHdr {
    String keyword = tag.toString();
    if (fitsKV.containsKey(keyword)) {
      HeaderCard thisCard = fitsKV.get(keyword);
-      return new HeaderCard(thisCard.getKey(),
-          Double.parseDouble(thisCard.getValue().replace('d', 'e').replace('D', 'e')),
-          thisCard.getComment());
+      return new HeaderCard(
+          thisCard.getKey(),
+          Double.parseDouble(thisCard.getValue().replace('d', 'e').replace('D', 'e')),
+          thisCard.getComment());
    } else {
-      System.out.println(
-          "WARNING! keyword:" + keyword + " not set in FitsHdr," + " using default value.");
+      System.out.println("WARNING! keyword:" + keyword + " not set in FitsHdr," + " using default value.");
      return new HeaderCard(keyword, tag.value(), tag.comment());
    }
  }
  /**
@@ -1101,15 +1060,12 @@ public class FitsHdr {
    if (fitsKV.containsKey(keyword)) {
      HeaderCard thisCard = fitsKV.get(keyword);
      double value = Double.parseDouble(thisCard.getValue());
-      if (Double.isFinite(value))
-        value = Double.parseDouble(String.format(fmtS, value));
+      if (Double.isFinite(value)) value = Double.parseDouble(String.format(fmtS, value));
      return new HeaderCard(thisCard.getKey(), value, thisCard.getComment());
    } else {
-      String mesg =
-          "WARNING! keyword:" + tag.toString() + " not set in FitsHdr," + " using default value.";
+      String mesg = "WARNING! keyword:" + tag.toString() + " not set in FitsHdr," + " using default value.";
      System.out.println(mesg);
      return new HeaderCard(keyword, tag.value(), tag.comment());
    }
  }
@@ -1136,7 +1092,5 @@
    }
    return matchCard;
  }
}

View File

@@ -28,10 +28,10 @@ import java.nio.charset.Charset;
import java.util.EnumMap;
import java.util.List;
import java.util.Map;
-import org.apache.commons.io.FileUtils;
import nom.tam.fits.FitsException;
import nom.tam.fits.HeaderCard;
import nom.tam.fits.HeaderCardException;
+import org.apache.commons.io.FileUtils;
import terrasaur.utils.StringUtil;
/**
@@ -211,55 +211,33 @@ public class FitsHeader {
    private FitsValCom nAxis2 = new FitsValCom("1024", null);
    private FitsValCom nAxis3 = new FitsValCom("numberPlanesNotSet", null);
-    private FitsValCom hdrVers =
-        new FitsValCom(HeaderTag.HDRVERS.value(), HeaderTag.HDRVERS.comment());
-    private FitsValCom mission =
-        new FitsValCom(HeaderTag.MISSION.value(), HeaderTag.MISSION.comment());
+    private FitsValCom hdrVers = new FitsValCom(HeaderTag.HDRVERS.value(), HeaderTag.HDRVERS.comment());
+    private FitsValCom mission = new FitsValCom(HeaderTag.MISSION.value(), HeaderTag.MISSION.comment());
    private FitsValCom hostName = new FitsValCom(HeaderTag.HOSTNAME.value(), null);
    private FitsValCom target = new FitsValCom(HeaderTag.TARGET.value(), null);
-    private FitsValCom origin =
-        new FitsValCom(HeaderTag.ORIGIN.value(), HeaderTag.ORIGIN.comment());
-    private FitsValCom spocid =
-        new FitsValCom(HeaderTag.SPOC_ID.value(), HeaderTag.SPOC_ID.comment());
-    private FitsValCom sdparea =
-        new FitsValCom(HeaderTag.SDPAREA.value(), HeaderTag.SDPAREA.comment());
-    private FitsValCom sdpdesc =
-        new FitsValCom(HeaderTag.SDPDESC.value(), HeaderTag.SDPDESC.comment());
-    private FitsValCom missionPhase =
-        new FitsValCom(HeaderTag.MPHASE.value(), HeaderTag.MPHASE.comment());
-    private FitsValCom dataSource =
-        new FitsValCom(HeaderTag.DATASRC.value(), HeaderTag.DATASRC.comment());
-    private FitsValCom dataSourceFile =
-        new FitsValCom(HeaderTag.DATASRCF.value(), HeaderTag.DATASRCF.comment());
-    private FitsValCom dataSourceS =
-        new FitsValCom(HeaderTag.DATASRCS.value(), HeaderTag.DATASRCS.comment());
-    private FitsValCom dataSourceV =
-        new FitsValCom(HeaderTag.DATASRCV.value(), HeaderTag.DATASRCV.comment());
-    private FitsValCom software =
-        new FitsValCom(HeaderTag.SOFTWARE.value(), HeaderTag.SOFTWARE.comment());
-    private FitsValCom softver =
-        new FitsValCom(HeaderTag.SOFT_VER.value(), HeaderTag.SOFT_VER.comment());
-    private FitsValCom datasrcd =
-        new FitsValCom(HeaderTag.DATASRCD.value(), HeaderTag.DATASRCD.comment());
-    private FitsValCom productName =
-        new FitsValCom(HeaderTag.PRODNAME.value(), HeaderTag.PRODNAME.comment());
-    private FitsValCom dateprd =
-        new FitsValCom(HeaderTag.DATEPRD.value(), HeaderTag.DATEPRD.comment());
+    private FitsValCom origin = new FitsValCom(HeaderTag.ORIGIN.value(), HeaderTag.ORIGIN.comment());
+    private FitsValCom spocid = new FitsValCom(HeaderTag.SPOC_ID.value(), HeaderTag.SPOC_ID.comment());
+    private FitsValCom sdparea = new FitsValCom(HeaderTag.SDPAREA.value(), HeaderTag.SDPAREA.comment());
+    private FitsValCom sdpdesc = new FitsValCom(HeaderTag.SDPDESC.value(), HeaderTag.SDPDESC.comment());
+    private FitsValCom missionPhase = new FitsValCom(HeaderTag.MPHASE.value(), HeaderTag.MPHASE.comment());
+    private FitsValCom dataSource = new FitsValCom(HeaderTag.DATASRC.value(), HeaderTag.DATASRC.comment());
+    private FitsValCom dataSourceFile = new FitsValCom(HeaderTag.DATASRCF.value(), HeaderTag.DATASRCF.comment());
+    private FitsValCom dataSourceS = new FitsValCom(HeaderTag.DATASRCS.value(), HeaderTag.DATASRCS.comment());
+    private FitsValCom dataSourceV = new FitsValCom(HeaderTag.DATASRCV.value(), HeaderTag.DATASRCV.comment());
+    private FitsValCom software = new FitsValCom(HeaderTag.SOFTWARE.value(), HeaderTag.SOFTWARE.comment());
+    private FitsValCom softver = new FitsValCom(HeaderTag.SOFT_VER.value(), HeaderTag.SOFT_VER.comment());
+    private FitsValCom datasrcd = new FitsValCom(HeaderTag.DATASRCD.value(), HeaderTag.DATASRCD.comment());
+    private FitsValCom productName = new FitsValCom(HeaderTag.PRODNAME.value(), HeaderTag.PRODNAME.comment());
+    private FitsValCom dateprd = new FitsValCom(HeaderTag.DATEPRD.value(), HeaderTag.DATEPRD.comment());
    private FitsValCom productType = new FitsValCom("productTypeNotSet", null);
-    private FitsValCom productVer =
-        new FitsValCom(HeaderTag.PRODVERS.value(), HeaderTag.PRODVERS.comment());
-    private FitsValCom mapVer =
-        new FitsValCom(HeaderTag.MAP_VER.value(), HeaderTag.MAP_VER.comment());
+    private FitsValCom productVer = new FitsValCom(HeaderTag.PRODVERS.value(), HeaderTag.PRODVERS.comment());
+    private FitsValCom mapVer = new FitsValCom(HeaderTag.MAP_VER.value(), HeaderTag.MAP_VER.comment());
-    private FitsValCom objFile =
-        new FitsValCom(HeaderTag.OBJ_FILE.value(), HeaderTag.OBJ_FILE.comment());
-    private FitsValCom author =
-        new FitsValCom(HeaderTag.CREATOR.value(), HeaderTag.CREATOR.comment());
+    private FitsValCom objFile = new FitsValCom(HeaderTag.OBJ_FILE.value(), HeaderTag.OBJ_FILE.comment());
+    private FitsValCom author = new FitsValCom(HeaderTag.CREATOR.value(), HeaderTag.CREATOR.comment());
-    private FitsValCom mapName =
-        new FitsValCom(HeaderTag.MAP_NAME.value(), HeaderTag.MAP_NAME.comment());
-    private FitsValCom mapType =
-        new FitsValCom(HeaderTag.MAP_TYPE.value(), HeaderTag.MAP_TYPE.comment());
+    private FitsValCom mapName = new FitsValCom(HeaderTag.MAP_NAME.value(), HeaderTag.MAP_NAME.comment());
+    private FitsValCom mapType = new FitsValCom(HeaderTag.MAP_TYPE.value(), HeaderTag.MAP_TYPE.comment());
    private FitsValCom clon = new FitsValCom("-999", HeaderTag.CLON.comment());
    private FitsValCom clat = new FitsValCom("-999", HeaderTag.CLAT.comment());
    private FitsValCom minlon = new FitsValCom("-999", HeaderTag.MINLON.comment());
@@ -293,8 +271,7 @@ public class FitsHeader {
    private FitsValCom sigDef = new FitsValCom(HeaderTag.SIGMA.value(), HeaderTag.SIGMA.comment());
    private FitsValCom dqual1 = new FitsValCom("-999", HeaderTag.DQUAL_1.comment());
    private FitsValCom dqual2 = new FitsValCom("-999", HeaderTag.DQUAL_2.comment());
-    private FitsValCom dsigDef =
-        new FitsValCom(HeaderTag.DSIG_DEF.value(), HeaderTag.DSIG_DEF.comment());
+    private FitsValCom dsigDef = new FitsValCom(HeaderTag.DSIG_DEF.value(), HeaderTag.DSIG_DEF.comment());
    private FitsValCom density = new FitsValCom("-999", HeaderTag.DENSITY.comment());
    private FitsValCom rotRate = new FitsValCom("-999", HeaderTag.ROT_RATE.comment());
    private FitsValCom refPot = new FitsValCom("-999", HeaderTag.REF_POT.comment());
@@ -303,8 +280,7 @@ public class FitsHeader {
    private FitsValCom tiltMin = new FitsValCom("-999", HeaderTag.TILT_MIN.comment());
    private FitsValCom tiltPa = new FitsValCom("0", HeaderTag.TILT_PA.comment());
-    private EnumMap<HeaderTag, FitsValCom> tag2valcom =
-        new EnumMap<HeaderTag, FitsValCom>(HeaderTag.class);
+    private EnumMap<HeaderTag, FitsValCom> tag2valcom = new EnumMap<HeaderTag, FitsValCom>(HeaderTag.class);
    public FitsHeaderBuilder() {
@@ -380,7 +356,6 @@ public class FitsHeader {
      tag2valcom.put(HeaderTag.MAP_NAME, mapName);
      tag2valcom.put(HeaderTag.MAP_TYPE, mapType);
      tag2valcom.put(HeaderTag.MAP_VER, mapVer);
    }
    public FitsHeaderBuilder setTarget(String val, String comment) {
@@ -448,7 +423,8 @@ public class FitsHeader {
        hdrTag = HeaderTag.valueOf(headerCard.getKey());
        setVCbyHeaderTag(hdrTag, headerCard.getValue(), headerCard.getComment());
      } catch (IllegalArgumentException e) {
-        if ((headerCard.getKey().contains("COMMENT")) || (headerCard.getKey().contains("PLANE"))) {
+        if ((headerCard.getKey().contains("COMMENT"))
+            || (headerCard.getKey().contains("PLANE"))) {
        } else {
          System.out.println(headerCard.getKey() + " not a HeaderTag");
        }
@@ -457,14 +433,12 @@ public class FitsHeader {
      }
      return this;
    }
    public FitsHeader build() {
      return new FitsHeader(this);
    }
  }
  /**
@@ -506,7 +480,6 @@ public class FitsHeader {
          + fitsFile.toString() + " for fits header!";
      System.err.println(errmesg);
      System.exit(1);
    }
    return hdrBuilder;
  }
@@ -533,8 +506,8 @@ public class FitsHeader {
    }
    if (hdrBuilder == null) {
-      System.out.println("builder passed to FitsHeader.configHdrBuilder() is null. Generating"
-          + " new FitsHeaderBuilder");
+      System.out.println(
+          "builder passed to FitsHeader.configHdrBuilder() is null. Generating" + " new FitsHeaderBuilder");
      hdrBuilder = new FitsHeaderBuilder();
    }
    List<String> content = FileUtils.readLines(new File(configFile), Charset.defaultCharset());
@@ -560,8 +533,8 @@ public class FitsHeader {
        // user explicitly wants to override any comment in this header with null
        hdrBuilder.setVCbyHeaderTag(thisTag, keyval[1], null);
      } else {
-        System.out.println("setting " + thisTag.toString() + " to " + keyval[1]
-            + ", comment to " + keyval[2]);
+        System.out.println(
+            "setting " + thisTag.toString() + " to " + keyval[1] + ", comment to " + keyval[2]);
        hdrBuilder.setVCbyHeaderTag(thisTag, keyval[1], keyval[2]);
      }
    } else {
@@ -570,7 +543,6 @@ public class FitsHeader {
      System.out.println(line);
      System.out.println("Cannot parse. skipping this line");
    }
    }
    }
  }
@@ -609,7 +581,6 @@ public class FitsHeader {
    int returnType = 0;
    switch (tag) {
      case PXPERDEG:
      case CLON:
      case CLAT:
@@ -665,15 +636,11 @@ public class FitsHeader {
        return new HeaderCard(tag.toString(), valcom.getV(), valcom.getC());
      case 1:
-        return new HeaderCard(tag.toString(), StringUtil.str2fmtD(fmtS, valcom.getV()),
-            valcom.getC());
+        return new HeaderCard(tag.toString(), StringUtil.str2fmtD(fmtS, valcom.getV()), valcom.getC());
      case 2:
-        return new HeaderCard(tag.toString(), StringUtil.parseSafeD(valcom.getV()),
-            valcom.getC());
+        return new HeaderCard(tag.toString(), StringUtil.parseSafeD(valcom.getV()), valcom.getC());
    }
    }
    String errMesg = "ERROR!, cannot find fits keyword:" + tag.toString();
@@ -702,5 +669,4 @@ public class FitsHeader {
    }
    return hdrBuilder;
  }
}

View File

@@ -42,11 +42,9 @@ public class FitsHeaderFactory {
  private static final String PLANE = "PLANE";
  private static final String COMMENT = "COMMENT";
  public static DTMHeader getDTMHeader(FitsHdr fitsHdr, FitsHeaderType headerType) {
    switch (headerType) {
      case NFTMLN:
        return new NFTmln(fitsHdr);
@@ -65,7 +63,6 @@ public class FitsHeaderFactory {
      default:
        return null;
    }
  }
  public static AnciFitsHeader getAnciHeader(FitsHdr fitsHdr, FitsHeaderType headerType) {
@@ -89,10 +86,8 @@ public class FitsHeaderFactory {
      default:
        return null;
    }
  }
  /**
   * Fits Header block that contains information about the fits header itself. Ex. Header version
   * number.
@@ -108,7 +103,6 @@ public class FitsHeaderFactory {
    headers.add(fitsHdr.getHeaderCard(HeaderTag.HDRVERS));
    return headers;
  }
  /**
@@ -129,7 +123,6 @@ public class FitsHeaderFactory {
    headers.add(fitsHdr.getHeaderCard(HeaderTag.ORIGIN));
    return headers;
  }
  /**
@@ -151,7 +144,6 @@ public class FitsHeaderFactory {
    headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
    return headers;
  }
  /**
@@ -184,7 +176,6 @@ public class FitsHeaderFactory {
    headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFT_VER));
    return headers;
  }
  public static List<HeaderCard> getMapSpecificInfo(FitsHeader fitsHdr) throws HeaderCardException {
@@ -203,8 +194,8 @@ public class FitsHeaderFactory {
    // GSDI*
  }
-  public static List<HeaderCard> getSummarySpatialInfo(FitsHeader fitsHdr,
-      FitsHeaderType fitsHeaderType) throws HeaderCardException {
+  public static List<HeaderCard> getSummarySpatialInfo(FitsHeader fitsHdr, FitsHeaderType fitsHeaderType)
+      throws HeaderCardException {
    List<HeaderCard> headers = new ArrayList<HeaderCard>();
@@ -230,6 +221,5 @@ public class FitsHeaderFactory {
        break;
    }
    return headers;
  }
}

View File

@@ -28,7 +28,6 @@ import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
-import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import nom.tam.fits.BasicHDU;
import nom.tam.fits.Fits;
import nom.tam.fits.FitsException;
@@ -37,6 +36,7 @@ import nom.tam.fits.HeaderCard;
import nom.tam.fits.TableHDU;
import nom.tam.util.BufferedFile;
import nom.tam.util.Cursor;
+import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
/**
 * Utility class containing generic static routines for working with FITS files. Refer to AltwgFits
@@ -50,8 +50,7 @@ public class FitsUtil {
  // private static final String PLANE = "PLANE";
  // public static final String COMMENT = "COMMENT";
-  public static double[][][] loadFits(String filename, int[] axes)
-      throws FitsException, IOException {
+  public static double[][][] loadFits(String filename, int[] axes) throws FitsException, IOException {
    Fits f = new Fits(filename);
    BasicHDU hdu = f.getHDU(0);
    int[] axes2 = hdu.getAxes();
@@ -87,7 +86,6 @@ public class FitsUtil {
    int[] axes = hdu.getAxes();
    f.close();
    return axes;
  }
  /**
@@ -107,8 +105,7 @@ public class FitsUtil {
    BasicHDU hdu = f.getHDU(0);
    axes = hdu.getAxes();
    if (axes.length != 3) {
-      throw new IOException(
-          "FITS file has incorrect dimensions! This was assumed to be 3D fits file!");
+      throw new IOException("FITS file has incorrect dimensions! This was assumed to be 3D fits file!");
    }
    // fits axes specifies image dimensions in this order: [numPlanes][iSize][jSize]
@@ -162,8 +159,7 @@ public class FitsUtil {
    for (int ii = 0; ii < fitsData[1].length; ii++) {
      for (int jj = 0; jj < fitsData[0][1].length; jj++) {
-        Vector3D fits1P =
-            new Vector3D(fitsData[4][ii][jj], fitsData[5][ii][jj], fitsData[6][ii][jj]);
+        Vector3D fits1P = new Vector3D(fitsData[4][ii][jj], fitsData[5][ii][jj], fitsData[6][ii][jj]);
        // find angular separation between fits1 P vector and fits2 P vector
        double radSep = Vector3D.angle(fits1P, xyzV);
@@ -194,7 +190,6 @@ public class FitsUtil {
  public static boolean isGeneralHeaderKey(HeaderTag keyword) {
    String name = keyword.name();
    switch (name) {
      case "SIMPLE":
      case "BITPIX":
      case "NAXIS":
@@ -229,10 +224,8 @@ public class FitsUtil {
    // Add any tags passed in as the third argument of this function
    if (newHeaderCards != null) {
      for (HeaderCard hc : newHeaderCards) {
-        if (hc.getKey().toUpperCase().equals("COMMENT"))
-          hdu.getHeader().insertComment(hc.getComment());
-        else
-          hdu.getHeader().addLine(hc);
+        if (hc.getKey().toUpperCase().equals("COMMENT")) hdu.getHeader().insertComment(hc.getComment());
+        else hdu.getHeader().addLine(hc);
      }
    }
@@ -264,10 +257,8 @@ public class FitsUtil {
    // Add any tags passed in as the third argument of this function
    if (newHeaderCards != null) {
      for (HeaderCard hc : newHeaderCards) {
-        if (hc.getKey().toUpperCase().equals("COMMENT"))
-          hdu.getHeader().insertComment(hc.getComment());
-        else
-          hdu.getHeader().addLine(hc);
+        if (hc.getKey().toUpperCase().equals("COMMENT")) hdu.getHeader().insertComment(hc.getComment());
+        else hdu.getHeader().addLine(hc);
      }
    }
@@ -279,7 +270,6 @@ public class FitsUtil {
    f.close();
  }
  /**
   * Assuming data consists of multiple 2D planes with the following ordering: Note that this
   * particular ordering is created so that the data is written properly to the fits file.
@@ -359,8 +349,9 @@ public class FitsUtil {
        double lon = indata[1][i][j];
        double rad = indata[2][i][j];
-        double[] pt =
-            new Vector3D(Math.toRadians(lon), Math.toRadians(lat)).scalarMultiply(rad).toArray();
+        double[] pt = new Vector3D(Math.toRadians(lon), Math.toRadians(lat))
+            .scalarMultiply(rad)
+            .toArray();
        outdata[0][i][j] = indata[0][i][j];
        outdata[1][i][j] = indata[1][i][j];
@@ -413,8 +404,7 @@ public class FitsUtil {
    int numPlanes = indata.length;
    int numRows = indata[0].length;
    int numCols = indata[0][0].length;
-    double[][][] outdata =
-        new double[numPlanes][numRows - 2 * cropAmount][numCols - 2 * cropAmount];
+    double[][][] outdata = new double[numPlanes][numRows - 2 * cropAmount][numCols - 2 * cropAmount];
    for (int k = 0; k < numPlanes; ++k)
      for (int i = 0; i < numRows - 2 * cropAmount; ++i)
        for (int j = 0; j < numCols - 2 * cropAmount; ++j) {
@@ -460,8 +450,7 @@ public class FitsUtil {
   * @param newHeaderCard
   * @return
   */
-  public static List<HeaderCard> updateOrAppendCard(List<HeaderCard> headers,
-      HeaderCard newHeaderCard) {
+  public static List<HeaderCard> updateOrAppendCard(List<HeaderCard> headers, HeaderCard newHeaderCard) {
    List<HeaderCard> newHeaders = new ArrayList<HeaderCard>();
@@ -481,10 +470,8 @@ public class FitsUtil {
    }
    return newHeaders;
  }
  /**
   * Return the fits header as List&lt;HeaderCard&gt;. Each HeaderCard is equivalent to the FITS
   * Keyword = Value, comment line in a fits header.
@@ -520,8 +507,8 @@ public class FitsUtil {
   * @return
   * @throws FitsException
   */
-  public static HeaderCard getCard(Map<String, HeaderCard> map, String searchKey,
-      boolean failOnNull) throws FitsException {
+  public static HeaderCard getCard(Map<String, HeaderCard> map, String searchKey, boolean failOnNull)
+      throws FitsException {
    HeaderCard returnCard = null;
    if (map.containsKey(searchKey)) {
      returnCard = map.get(searchKey);
@@ -545,8 +532,7 @@ public class FitsUtil {
   * @throws FitsException
   * @throws IOException
   */
-  public static Map<String, HeaderCard> getFitsHeaderAsMap(String fitsFile)
-      throws FitsException, IOException {
+  public static Map<String, HeaderCard> getFitsHeaderAsMap(String fitsFile) throws FitsException, IOException {
    Fits inf = new Fits(fitsFile);
    BasicHDU inHdu = inf.getHDU(0);

View File

@@ -57,5 +57,4 @@ public class FitsValCom {
    setV(newVal);
    setC(newComment);
  }
}

View File

@@ -44,7 +44,6 @@ public class GenericAnciGlobal extends AnciTableFits implements AnciFitsHeader {
super(fitsHeader, FitsHeaderType.ANCIGLOBALGENERIC); super(fitsHeader, FitsHeaderType.ANCIGLOBALGENERIC);
} }
/** /**
* Build fits header portion that contains the spatial information of the Generic Anci Global * Build fits header portion that contains the spatial information of the Generic Anci Global
* product. Overrides the default implementation in AnciTableFits. * product. Overrides the default implementation in AnciTableFits.
@@ -64,5 +63,4 @@ public class GenericAnciGlobal extends AnciTableFits implements AnciFitsHeader {
return headers; return headers;
} }
} }

View File

@@ -39,6 +39,4 @@ public class GenericAnciLocal extends AnciTableFits implements AnciFitsHeader {
public GenericAnciLocal(FitsHdr fitsHeader) { public GenericAnciLocal(FitsHdr fitsHeader) {
super(fitsHeader, FitsHeaderType.ANCILOCALGENERIC); super(fitsHeader, FitsHeaderType.ANCILOCALGENERIC);
} }
} }

View File

@@ -37,5 +37,4 @@ public class GenericGlobalDTM extends DTMFits implements DTMHeader {
public GenericGlobalDTM(FitsHdr fitsHeader) { public GenericGlobalDTM(FitsHdr fitsHeader) {
super(fitsHeader, FitsHeaderType.DTMGLOBALGENERIC); super(fitsHeader, FitsHeaderType.DTMGLOBALGENERIC);
} }
} }

View File

@@ -37,5 +37,4 @@ public class GenericLocalDTM extends DTMFits implements DTMHeader {
public GenericLocalDTM(FitsHdr fitsHeader) { public GenericLocalDTM(FitsHdr fitsHeader) {
super(fitsHeader, FitsHeaderType.DTMLOCALGENERIC); super(fitsHeader, FitsHeaderType.DTMLOCALGENERIC);
} }
} }

View File

@@ -41,81 +41,109 @@ public enum HeaderTag {
// that are not null. Some of the default values obviously need to be updated when actual fits // that are not null. Some of the default values obviously need to be updated when actual fits
// file // file
// is created. // is created.
SIMPLE("T", null), BITPIX(null, null), NAXIS("3", null), NAXIS1(null, null), NAXIS2(null, null), SIMPLE("T", null),
BITPIX(null, null),
NAXIS("3", null),
NAXIS1(null, null),
NAXIS2(null, null),
NAXIS3(null, null), EXTEND("T", null), HDRVERS("1.0.0", null), NPRDVERS("1.0.0", NAXIS3(null, null),
null), MISSION("OSIRIS-REx", "Name of mission"), HOSTNAME("OREX", "PDS ID"), EXTEND("T", null),
HDRVERS("1.0.0", null),
NPRDVERS("1.0.0", null),
MISSION("OSIRIS-REx", "Name of mission"),
HOSTNAME("OREX", "PDS ID"),
TARGET("101955 BENNU", "Target object"), ORIGIN("OREXSPOC", null), TARGET("101955 BENNU", "Target object"),
ORIGIN("OREXSPOC", null),
SPOC_ID("SPOCUPLOAD", null), SDPAREA("SPOCUPLOAD", null), SDPDESC("SPOCUPLOAD", null), SPOC_ID("SPOCUPLOAD", null),
SDPAREA("SPOCUPLOAD", null),
SDPDESC("SPOCUPLOAD", null),
MPHASE("FillMeIn", "Mission phase."), DATASRC("FillMeIn", MPHASE("FillMeIn", "Mission phase."),
"Shape model data source, i.e. 'SPC' or 'OLA'"), DATASRC("FillMeIn", "Shape model data source, i.e. 'SPC' or 'OLA'"),
DATASRCF("FillMeIn", "Source shape model data filename"), DATASRCV("FillMeIn", DATASRCF("FillMeIn", "Source shape model data filename"),
"Name and version of shape model"), DATASRCD("FillMeIn", DATASRCV("FillMeIn", "Name and version of shape model"),
"Creation date of shape model in UTC"), DATASRCS("N/A", DATASRCD("FillMeIn", "Creation date of shape model in UTC"),
"[m/pix] Shpe model plt scale"), SOFTWARE("FillMeIn", DATASRCS("N/A", "[m/pix] Shpe model plt scale"),
"Software used to create map data"), SOFTWARE("FillMeIn", "Software used to create map data"),
SOFT_VER("FillMeIn", "Version of software used to create map data"), SOFT_VER("FillMeIn", "Version of software used to create map data"),
DATEPRD("1701-10-09", "Date this product was produced in UTC"),
DATENPRD("1701-10-09", "Date this NFT product was produced in UTC"),
PRODNAME("FillMeIn", "Product filename"),
PRODVERS("1.0.0", "Product version number"),
DATEPRD("1701-10-09", "Date this product was produced in UTC"), DATENPRD("1701-10-09", MAP_VER("999", "Product version number."),
"Date this NFT product was produced in UTC"), PRODNAME("FillMeIn", CREATOR("ALT-pipeline", "Name of software that created this product"),
"Product filename"), PRODVERS("1.0.0", "Product version number"), AUTHOR("Espiritu", "Name of person that compiled this product"),
PROJECTN("Equirectangular", "Simple cylindrical projection"),
CLON("-999", "[deg] longitude at center of image"),
CLAT("-999", "[deg] latitude at center of image"),
MINLON(null, "[deg] minimum longitude of global DTM"),
MAXLON(null, "[deg] maximum longitude of global DTM"),
MINLAT(null, "[deg] minimum latitude of global DTM"),
MAXLAT(null, "[deg] maximum latitude of global DTM"),
MAP_VER("999", "Product version number."), CREATOR("ALT-pipeline", PXPERDEG("-999", "[pixel per degree] grid spacing of global map."),
"Name of software that created this product"), AUTHOR("Espiritu", LLCLAT("-999", "[deg]"),
"Name of person that compiled this product"), PROJECTN("Equirectangular", LLCLNG("-999", "[deg]"),
"Simple cylindrical projection"), CLON("-999", ULCLAT("-999", "[deg]"),
"[deg] longitude at center of image"), CLAT("-999", ULCLNG("-999", "[deg]"),
"[deg] latitude at center of image"), MINLON(null,
"[deg] minimum longitude of global DTM"), MAXLON(null,
"[deg] maximum longitude of global DTM"), MINLAT(null,
"[deg] minimum latitude of global DTM"), MAXLAT(null,
"[deg] maximum latitude of global DTM"),
PXPERDEG("-999", "[pixel per degree] grid spacing of global map."), LLCLAT("-999", URCLAT("-999", "[deg]"),
"[deg]"), LLCLNG("-999", "[deg]"), ULCLAT("-999", "[deg]"), ULCLNG("-999", "[deg]"), URCLNG("-999", "[deg]"),
LRCLAT("-999", "[deg]"),
LRCLNG("-999", "[deg]"),
URCLAT("-999", "[deg]"), URCLNG("-999", "[deg]"), LRCLAT("-999", "[deg]"), LRCLNG("-999", CNTR_V_X("-999", "[km]"),
"[deg]"), CNTR_V_Y("-999", "[km]"),
CNTR_V_Z("-999", "[km]"),
UX_X("-999", "[m]"),
UX_Y("-999", "[m]"),
CNTR_V_X("-999", "[km]"), CNTR_V_Y("-999", "[km]"), CNTR_V_Z("-999", "[km]"), UX_X("-999", UX_Z("-999", "[m]"),
"[m]"), UX_Y("-999", "[m]"), UY_X("-999", "[m]"),
UY_Y("-999", "[m]"),
UY_Z("-999", "[m]"),
UZ_X("-999", "[m]"),
UX_Z("-999", "[m]"), UY_X("-999", "[m]"), UY_Y("-999", "[m]"), UY_Z("-999", "[m]"), UZ_X("-999", UZ_Y("-999", "/[m]"),
"[m]"), UZ_Z("-999", "[m]"),
GSD("-999", "[mm] grid spacing in units/pixel"),
GSDI("-999", "[unk] Ground sample distance integer"),
SIGMA("-999", "Global uncertainty of the data [m]"),
UZ_Y("-999", "/[m]"), UZ_Z("-999", "[m]"), GSD("-999", "[mm] grid spacing in units/pixel"), GSDI( SIG_DEF("Uncertainty", "SIGMA uncertainty metric"),
"-999", DQUAL_1("-999", "Data quality metric; incidence directions"),
"[unk] Ground sample distance integer"), SIGMA("-999", "Global uncertainty of the data [m]"),
SIG_DEF("Uncertainty", "SIGMA uncertainty metric"), DQUAL_1("-999", DQUAL_2("-999", "Data quality metric; emission directions"),
"Data quality metric; incidence directions"), DSIG_DEF("UNK", "Defines uncertainty metric in ancillary file"),
END(null, null),
DQUAL_2("-999", "Data quality metric; emission directions"), DSIG_DEF("UNK",
"Defines uncertainty metric in ancillary file"), END(null, null),
// additional fits tags describing gravity derived values // additional fits tags describing gravity derived values
DENSITY("-999", "[kgm^-3] density of body"), ROT_RATE("-999", DENSITY("-999", "[kgm^-3] density of body"),
"[rad/sec] rotation rate of body"), REF_POT("-999", "[J/kg] reference potential of body"), ROT_RATE("-999", "[rad/sec] rotation rate of body"),
REF_POT("-999", "[J/kg] reference potential of body"),
// additional fits tags describing tilt derived values // additional fits tags describing tilt derived values
TILT_RAD("-999", "[m]"), TILT_MAJ("-999", TILT_RAD("-999", "[m]"),
"[m] semi-major axis of ellipse for tilt calcs"), TILT_MIN("-999", TILT_MAJ("-999", "[m] semi-major axis of ellipse for tilt calcs"),
"[m] semi-minor axis of ellipse for tilt calcs"), TILT_PA("-999", TILT_MIN("-999", "[m] semi-minor axis of ellipse for tilt calcs"),
"[deg] position angle of ellipse for tilt calcs"), TILT_PA("-999", "[deg] position angle of ellipse for tilt calcs"),
// Additional fits tags specific to ancillary fits files // Additional fits tags specific to ancillary fits files
MAP_NAME("FillMeIn", "Map data type"), MAP_TYPE("FillMeIn", MAP_NAME("FillMeIn", "Map data type"),
"Defines whether this is a global or local map"), OBJ_FILE("TEMPLATEENTRY", null), MAP_TYPE("FillMeIn", "Defines whether this is a global or local map"),
OBJ_FILE("TEMPLATEENTRY", null),
// keywords specific to facet mapping ancillary file // keywords specific to facet mapping ancillary file
OBJINDX("UNK", "OBJ indexed to OBJ_FILE"), GSDINDX("-999", OBJINDX("UNK", "OBJ indexed to OBJ_FILE"),
"[unk] Ground sample distance of OBJINDX"), GSDINDXI("-999", "[unk] GSDINDX integer"), GSDINDX("-999", "[unk] Ground sample distance of OBJINDX"),
GSDINDXI("-999", "[unk] GSDINDX integer"),
// return this enum when no match is found // return this enum when no match is found
NOMATCH(null, "could not determine"); NOMATCH(null, "could not determine");
@@ -123,7 +151,6 @@ public enum HeaderTag {
// PIXPDEG(null,"pixels per degree","pixel/deg"), // PIXPDEG(null,"pixels per degree","pixel/deg"),
// PIX_SZ(null, "mean size of pixels at equator (meters)","m"); // PIX_SZ(null, "mean size of pixels at equator (meters)","m");
private FitsValCom fitsValCom; private FitsValCom fitsValCom;
private HeaderTag(String value, String comment) { private HeaderTag(String value, String comment) {
@@ -141,12 +168,9 @@ public enum HeaderTag {
/** /**
* Contains all valid Fits keywords for this Enum. 'NOMATCH' is not a valid Fits keyword * Contains all valid Fits keywords for this Enum. 'NOMATCH' is not a valid Fits keyword
*/ */
public static final EnumSet<HeaderTag> fitsKeywords = public static final EnumSet<HeaderTag> fitsKeywords = EnumSet.range(HeaderTag.SIMPLE, HeaderTag.GSDINDXI);
EnumSet.range(HeaderTag.SIMPLE, HeaderTag.GSDINDXI);
public static final EnumSet<HeaderTag> globalDTMFitsData =
EnumSet.of(HeaderTag.CLAT, HeaderTag.CLON);
public static final EnumSet<HeaderTag> globalDTMFitsData = EnumSet.of(HeaderTag.CLAT, HeaderTag.CLON);
/** /**
* Return the HeaderTag associated with a given string. returns NOMATCH enum if no match found. * Return the HeaderTag associated with a given string. returns NOMATCH enum if no match found.
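A minimal sketch of that lookup pattern, using a cut-down stand-in enum rather than the real HeaderTag (the actual implementation lives in terrasaur.enums; only the NOMATCH fallback described above is illustrated)::

    enum Tag {
        SIMPLE, BITPIX, NAXIS, NOMATCH;

        // return the matching constant, or NOMATCH when the keyword is unknown
        static Tag fromString(String keyword) {
            for (Tag t : values()) {
                if (t.name().equalsIgnoreCase(keyword)) return t;
            }
            return NOMATCH;
        }

        public static void main(String[] args) {
            System.out.println(fromString("naxis"));    // NAXIS
            System.out.println(fromString("BOGUSKEY")); // NOMATCH
        }
    }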

View File

@@ -85,7 +85,6 @@ public class NFTmln extends DTMFits implements DTMHeader {
headers.add(fitsHdr.getHeaderCard(HeaderTag.NPRDVERS)); headers.add(fitsHdr.getHeaderCard(HeaderTag.NPRDVERS));
return headers; return headers;
} }
/** /**
@@ -102,7 +101,6 @@ public class NFTmln extends DTMFits implements DTMHeader {
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCF)); headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCF));
return headers; return headers;
} }
/** /**
@@ -121,7 +119,6 @@ public class NFTmln extends DTMFits implements DTMHeader {
headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFT_VER)); headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFT_VER));
return headers; return headers;
} }
/** /**
@@ -150,7 +147,6 @@ public class NFTmln extends DTMFits implements DTMHeader {
headers.add(fitsHdr.getHeaderCardD(HeaderTag.DQUAL_2)); headers.add(fitsHdr.getHeaderCardD(HeaderTag.DQUAL_2));
return headers; return headers;
} }
/** /**
@@ -168,7 +164,6 @@ public class NFTmln extends DTMFits implements DTMHeader {
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATENPRD)); headers.add(fitsHdr.getHeaderCard(HeaderTag.DATENPRD));
return headers; return headers;
} }
/** /**
@@ -185,7 +180,5 @@ public class NFTmln extends DTMFits implements DTMHeader {
headers.add(fitsHdr.getHeaderCard(HeaderTag.CREATOR)); headers.add(fitsHdr.getHeaderCard(HeaderTag.CREATOR));
return headers; return headers;
} }
} }

View File

@@ -34,12 +34,12 @@ import java.util.HashMap;
import java.util.List; import java.util.List;
import java.util.Map; import java.util.Map;
import java.util.TimeZone; import java.util.TimeZone;
import org.apache.commons.io.FilenameUtils;
import nom.tam.fits.BasicHDU; import nom.tam.fits.BasicHDU;
import nom.tam.fits.Fits; import nom.tam.fits.Fits;
import nom.tam.fits.FitsException; import nom.tam.fits.FitsException;
import nom.tam.fits.HeaderCard; import nom.tam.fits.HeaderCard;
import nom.tam.util.Cursor; import nom.tam.util.Cursor;
import org.apache.commons.io.FilenameUtils;
import terrasaur.enums.FitsHeaderType; import terrasaur.enums.FitsHeaderType;
import terrasaur.enums.PlaneInfo; import terrasaur.enums.PlaneInfo;
import terrasaur.fits.FitsHdr.FitsHdrBuilder; import terrasaur.fits.FitsHdr.FitsHdrBuilder;
@@ -65,8 +65,7 @@ public class ProductFits {
* @throws FitsException * @throws FitsException
* @throws IOException * @throws IOException
*/ */
public static List<HeaderCard> getPlaneHeaderCards(String fitsFile) public static List<HeaderCard> getPlaneHeaderCards(String fitsFile) throws FitsException, IOException {
throws FitsException, IOException {
Fits inFits = new Fits(fitsFile); Fits inFits = new Fits(fitsFile);
List<HeaderCard> planeHeaders = getPlaneHeaderCards(inFits); List<HeaderCard> planeHeaders = getPlaneHeaderCards(inFits);
return planeHeaders; return planeHeaders;
@@ -81,8 +80,7 @@ public class ProductFits {
* @throws FitsException * @throws FitsException
* @throws IOException * @throws IOException
*/ */
public static List<HeaderCard> getPlaneHeaderCards(Fits inFitsFile) public static List<HeaderCard> getPlaneHeaderCards(Fits inFitsFile) throws FitsException, IOException {
throws FitsException, IOException {
BasicHDU inHdu = inFitsFile.getHDU(0); BasicHDU inHdu = inFitsFile.getHDU(0);
List<HeaderCard> planeHeaders = getPlaneHeaderCards(inHdu); List<HeaderCard> planeHeaders = getPlaneHeaderCards(inHdu);
return planeHeaders; return planeHeaders;
@@ -116,8 +114,7 @@ public class ProductFits {
* @throws IOException * @throws IOException
* @throws FitsException * @throws FitsException
*/ */
public static Map<String, Double> minMaxLLFromFits(File fitsFile) public static Map<String, Double> minMaxLLFromFits(File fitsFile) throws FitsException, IOException {
throws FitsException, IOException {
System.out.println("Determining minmax lat lon from fits file:" + fitsFile.getAbsolutePath()); System.out.println("Determining minmax lat lon from fits file:" + fitsFile.getAbsolutePath());
@@ -298,8 +295,7 @@ public class ProductFits {
// save outfile name in cross-reference file, for future reference // save outfile name in cross-reference file, for future reference
String path = FilenameUtils.getFullPath(outFname); String path = FilenameUtils.getFullPath(outFname);
if (path.length() == 0) path = "."; if (path.length() == 0) path = ".";
String outBaseName = String outBaseName = String.format("%s%s%s", path, File.pathSeparator, FilenameUtils.getBaseName(outFname));
String.format("%s%s%s", path, File.pathSeparator, FilenameUtils.getBaseName(outFname));
AsciiFile crfFile = new AsciiFile(crossrefFile.getAbsolutePath()); AsciiFile crfFile = new AsciiFile(crossrefFile.getAbsolutePath());
crfFile.streamSToFile(outBaseName, 0); crfFile.streamSToFile(outBaseName, 0);
crfFile.closeFile(); crfFile.closeFile();
@@ -327,10 +323,7 @@ public class ProductFits {
* @throws FitsException * @throws FitsException
*/ */
public static void saveDataCubeFits( public static void saveDataCubeFits(
double[][][] dataCube, double[][][] dataCube, FitsHdrBuilder hdrBuilder, List<HeaderCard> planeHeaders, String outFile)
FitsHdrBuilder hdrBuilder,
List<HeaderCard> planeHeaders,
String outFile)
throws FitsException, IOException { throws FitsException, IOException {
FitsHdr fitsHdr = hdrBuilder.build(); FitsHdr fitsHdr = hdrBuilder.build();
@@ -357,10 +350,7 @@ public class ProductFits {
* @throws FitsException * @throws FitsException
*/ */
public static void saveDataCubeFits( public static void saveDataCubeFits(
double[][][] dataCube, double[][][] dataCube, List<HeaderCard> headers, List<HeaderCard> planeKeywords, String outFname)
List<HeaderCard> headers,
List<HeaderCard> planeKeywords,
String outFname)
throws FitsException, IOException { throws FitsException, IOException {
// append planeHeaders // append planeHeaders
@@ -414,8 +404,7 @@ public class ProductFits {
DTMHeader nftFitsHeader = FitsHeaderFactory.getDTMHeader(fitsHeader, FitsHeaderType.NFTMLN); DTMHeader nftFitsHeader = FitsHeaderFactory.getDTMHeader(fitsHeader, FitsHeaderType.NFTMLN);
nftFitsHeader.setData(fitsData); nftFitsHeader.setData(fitsData);
List<HeaderCard> headers = List<HeaderCard> headers = nftFitsHeader.createFitsHeader(PlaneInfo.planesToHeaderCard(planeList));
nftFitsHeader.createFitsHeader(PlaneInfo.planesToHeaderCard(planeList));
System.out.println("Saving to " + outNftFile); System.out.println("Saving to " + outNftFile);
FitsUtil.saveFits(fitsData.getData(), outNftFile, headers); FitsUtil.saveFits(fitsData.getData(), outNftFile, headers);

View File

@@ -23,7 +23,6 @@
package terrasaur.fits; package terrasaur.fits;
public enum UnitDir { public enum UnitDir {
UX { UX {
public int getAxis() { public int getAxis() {
return 1; return 1;

View File

@@ -36,9 +36,9 @@ import javafx.scene.control.ChoiceBox;
import javafx.scene.control.Label; import javafx.scene.control.Label;
import javafx.scene.control.TextField; import javafx.scene.control.TextField;
import javafx.stage.Stage; import javafx.stage.Stage;
import spice.basic.SpiceException;
import terrasaur.apps.TranslateTime; import terrasaur.apps.TranslateTime;
import terrasaur.utils.AppVersion; import terrasaur.utils.AppVersion;
import spice.basic.SpiceException;
public class TranslateTimeController implements Initializable { public class TranslateTimeController implements Initializable {
@@ -56,16 +56,12 @@ public class TranslateTimeController implements Initializable {
// populate SCLK menu // populate SCLK menu
NavigableSet<Integer> sclkIDs = new TreeSet<>(TranslateTimeFX.sclkIDs); NavigableSet<Integer> sclkIDs = new TreeSet<>(TranslateTimeFX.sclkIDs);
this.sclkChoice.getItems().addAll(sclkIDs); this.sclkChoice.getItems().addAll(sclkIDs);
this.sclkChoice.getSelectionModel().selectedIndexProperty() this.sclkChoice.getSelectionModel().selectedIndexProperty().addListener(new ChangeListener<Number>() {
.addListener(new ChangeListener<Number>() {
@Override @Override
public void changed(ObservableValue<? extends Number> observable, Number oldValue, public void changed(ObservableValue<? extends Number> observable, Number oldValue, Number newValue) {
Number newValue) {
tt.setSCLKKernel(sclkChoice.getItems().get((Integer) newValue)); tt.setSCLKKernel(sclkChoice.getItems().get((Integer) newValue));
} }
}); });
this.sclkChoice.getSelectionModel().select(0); this.sclkChoice.getSelectionModel().select(0);
@@ -90,8 +86,7 @@ public class TranslateTimeController implements Initializable {
@FXML @FXML
private void setJulian() throws NumberFormatException, SpiceException { private void setJulian() throws NumberFormatException, SpiceException {
if (julianString.getText().trim().length() > 0) if (julianString.getText().trim().length() > 0) tt.setJulianDate(Double.parseDouble(julianString.getText()));
tt.setJulianDate(Double.parseDouble(julianString.getText()));
updateTime(); updateTime();
} }
@@ -103,8 +98,7 @@ public class TranslateTimeController implements Initializable {
@FXML @FXML
private void setSCLK() throws SpiceException { private void setSCLK() throws SpiceException {
if (sclkString.getText().trim().length() > 0) if (sclkString.getText().trim().length() > 0) tt.setSCLK(sclkString.getText());
tt.setSCLK(sclkString.getText());
updateTime(); updateTime();
} }
@@ -113,8 +107,7 @@ public class TranslateTimeController implements Initializable {
@FXML @FXML
private void setTDB() throws SpiceException { private void setTDB() throws SpiceException {
if (tdbString.getText().trim().length() > 0) if (tdbString.getText().trim().length() > 0) tt.setTDB(Double.parseDouble(tdbString.getText()));
tt.setTDB(Double.parseDouble(tdbString.getText()));
updateTime(); updateTime();
} }
@@ -123,8 +116,7 @@ public class TranslateTimeController implements Initializable {
@FXML @FXML
private void setTDBCalendar() throws SpiceException { private void setTDBCalendar() throws SpiceException {
if (tdbCalendarString.getText().trim().length() > 0) if (tdbCalendarString.getText().trim().length() > 0) tt.setTDBCalendarString(tdbCalendarString.getText());
tt.setTDBCalendarString(tdbCalendarString.getText());
updateTime(); updateTime();
} }
@@ -136,8 +128,7 @@ public class TranslateTimeController implements Initializable {
@FXML @FXML
private void setUTC() throws SpiceException { private void setUTC() throws SpiceException {
if (utcString.getText().trim().length() > 0) if (utcString.getText().trim().length() > 0) tt.setUTC(utcString.getText());
tt.setUTC(utcString.getText());
updateTime(); updateTime();
} }
@@ -149,5 +140,4 @@ public class TranslateTimeController implements Initializable {
utcString.setText(tt.toTDB().toUTCString("ISOC", 3)); utcString.setText(tt.toTDB().toUTCString("ISOC", 3));
utcLabel.setText(String.format("UTC (DOY %s)", tt.toTDB().toString("DOY"))); utcLabel.setText(String.format("UTC (DOY %s)", tt.toTDB().toString("DOY")));
} }
} }

View File

@@ -62,5 +62,4 @@ public class TranslateTimeFX extends Application {
stage.setScene(scene); stage.setScene(scene);
stage.show(); stage.show();
} }
} }

View File

@@ -22,7 +22,6 @@
*/ */
package terrasaur.smallBodyModel; package terrasaur.smallBodyModel;
import picante.math.intervals.Interval; import picante.math.intervals.Interval;
import picante.math.intervals.UnwritableInterval; import picante.math.intervals.UnwritableInterval;
import picante.math.vectorspace.UnwritableVectorIJK; import picante.math.vectorspace.UnwritableVectorIJK;
@@ -55,7 +54,9 @@ public class BoundingBox {
* @param bounds * @param bounds
*/ */
public BoundingBox(double[] bounds) { public BoundingBox(double[] bounds) {
this(new Interval(bounds[0], bounds[1]), new Interval(bounds[2], bounds[3]), this(
new Interval(bounds[0], bounds[1]),
new Interval(bounds[2], bounds[3]),
new Interval(bounds[4], bounds[5])); new Interval(bounds[4], bounds[5]));
} }
@@ -66,8 +67,7 @@ public class BoundingBox {
* @param yRange * @param yRange
* @param zRange * @param zRange
*/ */
public BoundingBox(UnwritableInterval xRange, UnwritableInterval yRange, public BoundingBox(UnwritableInterval xRange, UnwritableInterval yRange, UnwritableInterval zRange) {
UnwritableInterval zRange) {
this.xRange = new Interval(xRange); this.xRange = new Interval(xRange);
this.yRange = new Interval(yRange); this.yRange = new Interval(yRange);
this.zRange = new Interval(zRange); this.zRange = new Interval(zRange);
@@ -145,7 +145,8 @@ public class BoundingBox {
* @return * @return
*/ */
public boolean intersects(BoundingBox other) { public boolean intersects(BoundingBox other) {
return xRange.closedIntersects(other.xRange) && yRange.closedIntersects(other.yRange) return xRange.closedIntersects(other.xRange)
&& yRange.closedIntersects(other.yRange)
&& zRange.closedIntersects(other.zRange); && zRange.closedIntersects(other.zRange);
} }
@@ -179,8 +180,7 @@ public class BoundingBox {
* @return * @return
*/ */
public boolean contains(UnwritableVectorIJK pt) { public boolean contains(UnwritableVectorIJK pt) {
return xRange.closedContains(pt.getI()) && yRange.closedContains(pt.getJ()) return xRange.closedContains(pt.getI()) && yRange.closedContains(pt.getJ()) && zRange.closedContains(pt.getK());
&& zRange.closedContains(pt.getK());
} }
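Per axis, both the intersection and containment tests reduce to closed-interval checks. A minimal sketch with plain doubles in place of the picante interval classes, with bounds packed as {xmin, xmax, ymin, ymax, zmin, zmax} like the array constructor above (sample boxes are hypothetical)::

    public class BoxOverlap {
        // two closed intervals [minA, maxA] and [minB, maxB] overlap iff each one
        // starts before the other ends
        static boolean closedIntersects(double minA, double maxA, double minB, double maxB) {
            return minA <= maxB && minB <= maxA;
        }

        static boolean intersects(double[] a, double[] b) {
            return closedIntersects(a[0], a[1], b[0], b[1])
                    && closedIntersects(a[2], a[3], b[2], b[3])
                    && closedIntersects(a[4], a[5], b[4], b[5]);
        }

        public static void main(String[] args) {
            double[] a = {0, 1, 0, 1, 0, 1};
            double[] b = {0.5, 2, 0.5, 2, 0.5, 2};
            System.out.println(intersects(a, b)); // true
        }
    }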
/** /**

View File

@@ -22,6 +22,7 @@
*/ */
package terrasaur.smallBodyModel; package terrasaur.smallBodyModel;
import com.google.common.collect.HashMultimap;
import java.util.ArrayList; import java.util.ArrayList;
import java.util.HashMap; import java.util.HashMap;
import java.util.List; import java.util.List;
@@ -34,11 +35,10 @@ import org.apache.commons.math3.geometry.euclidean.threed.Rotation;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D; import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger; import org.apache.logging.log4j.Logger;
import com.google.common.collect.HashMultimap;
import picante.math.vectorspace.VectorIJK; import picante.math.vectorspace.VectorIJK;
import terrasaur.utils.math.MathConversions;
import terrasaur.utils.PolyDataStatistics; import terrasaur.utils.PolyDataStatistics;
import terrasaur.utils.PolyDataUtil; import terrasaur.utils.PolyDataUtil;
import terrasaur.utils.math.MathConversions;
import terrasaur.utils.tessellation.FibonacciSphere; import terrasaur.utils.tessellation.FibonacciSphere;
import vtk.vtkPoints; import vtk.vtkPoints;
import vtk.vtkPolyData; import vtk.vtkPolyData;
@@ -51,7 +51,7 @@ import vtk.vtkPolyData;
*/ */
public class LocalModelCollection { public class LocalModelCollection {
private final static Logger logger = LogManager.getLogger(); private static final Logger logger = LogManager.getLogger();
static class LocalModel { static class LocalModel {
final Vector3D center; final Vector3D center;
@@ -61,7 +61,6 @@ public class LocalModelCollection {
this.center = center; this.center = center;
this.filename = filename; this.filename = filename;
} }
} }
// key is tile index, value is collection of localModels // key is tile index, value is collection of localModels
@@ -109,19 +108,18 @@ public class LocalModelCollection {
*/ */
public SmallBodyModel get(Vector3D point) { public SmallBodyModel get(Vector3D point) {
List<String> filenames = getFilenames(point); List<String> filenames = getFilenames(point);
if (filenames.size() == 0) if (filenames.size() == 0) logger.error("No shape models cover {}", point.toString());
logger.error("No shape models cover {}", point.toString());
double[] origin = new double[3]; double[] origin = new double[3];
double[] intersectPoint = new double[3]; double[] intersectPoint = new double[3];
for (String filename : filenames) { for (String filename : filenames) {
SmallBodyModel sbm = load(filename); SmallBodyModel sbm = load(filename);
long intersect = long intersect = sbm.computeRayIntersection(origin, point.toArray(), 2 * point.getNorm(), intersectPoint);
sbm.computeRayIntersection(origin, point.toArray(), 2 * point.getNorm(), intersectPoint);
if (intersect != -1) if (intersect != -1) return sbm;
return sbm;
} }
logger.debug("Failed intersection for lon {}, lat {}", Math.toDegrees(point.getAlpha()), logger.debug(
"Failed intersection for lon {}, lat {}",
Math.toDegrees(point.getAlpha()),
Math.toDegrees(point.getDelta())); Math.toDegrees(point.getDelta()));
return null; return null;
} }
@@ -141,8 +139,7 @@ public class LocalModelCollection {
SmallBodyModel sbm = map.get(filename); SmallBodyModel sbm = map.get(filename);
if (sbm == null) { if (sbm == null) {
logger.debug("Thread {}: Loading {}", Thread.currentThread().getId(), logger.debug("Thread {}: Loading {}", Thread.currentThread().getId(), FilenameUtils.getBaseName(filename));
FilenameUtils.getBaseName(filename));
try { try {
vtkPolyData model = PolyDataUtil.loadShapeModel(filename); vtkPolyData model = PolyDataUtil.loadShapeModel(filename);
if (scale != null || rotation != null) { if (scale != null || rotation != null) {
@@ -153,9 +150,13 @@ public class LocalModelCollection {
for (int i = 0; i < points.GetNumberOfPoints(); i++) { for (int i = 0; i < points.GetNumberOfPoints(); i++) {
Vector3D thisPoint = new Vector3D(points.GetPoint(i)); Vector3D thisPoint = new Vector3D(points.GetPoint(i));
if (scale != null) if (scale != null)
thisPoint = thisPoint.subtract(center).scalarMultiply(scale).add(center); thisPoint = thisPoint
.subtract(center)
.scalarMultiply(scale)
.add(center);
if (rotation != null) if (rotation != null)
thisPoint = rotation.applyTo(thisPoint.subtract(center)).add(center); thisPoint =
rotation.applyTo(thisPoint.subtract(center)).add(center);
points.SetPoint(i, thisPoint.toArray()); points.SetPoint(i, thisPoint.toArray());
} }
} }
@@ -195,7 +196,8 @@ public class LocalModelCollection {
localDistanceMap.put(thisDist, localModel); localDistanceMap.put(thisDist, localModel);
} }
// add all local models with centers within PI/4 of point // add all local models with centers within PI/4 of point
for (double localDist : localDistanceMap.headMap(Math.PI / 4, true).keySet()) { for (double localDist :
localDistanceMap.headMap(Math.PI / 4, true).keySet()) {
smallBodyModels.add(localDistanceMap.get(localDist).filename); smallBodyModels.add(localDistanceMap.get(localDist).filename);
} }
} }
@@ -203,5 +205,4 @@ public class LocalModelCollection {
return smallBodyModels; return smallBodyModels;
} }
} }

View File

@@ -30,34 +30,34 @@ import terrasaur.smallBodyModel.ImmutableSBMTStructure.Builder;
/** /**
* *
* <pre> * <pre>
# SBMT Structure File * # SBMT Structure File
# type,point * # type,point
# ------------------------------------------------------------------------------ * # ------------------------------------------------------------------------------
# File consists of a list of structures on each line. * # File consists of a list of structures on each line.
# * #
# Each line is defined by 17 columns with the following: * # Each line is defined by 17 columns with the following:
# &lt;id&gt; &lt;name&gt; &lt;centerXYZ[3]&gt; &lt;centerLLR[3]&gt; &lt;coloringValue[4]&gt; &lt;diameter&gt; &lt;flattening&gt; &lt;regularAngle&gt; &lt;colorRGB&gt; &lt;label&gt;* * # &lt;id&gt; &lt;name&gt; &lt;centerXYZ[3]&gt; &lt;centerLLR[3]&gt; &lt;coloringValue[4]&gt; &lt;diameter&gt; &lt;flattening&gt; &lt;regularAngle&gt; &lt;colorRGB&gt; &lt;label&gt;*
# * #
# id: Id of the structure * # id: Id of the structure
# name: Name of the structure * # name: Name of the structure
# centerXYZ[3]: 3 columns that define the structure center in 3D space * # centerXYZ[3]: 3 columns that define the structure center in 3D space
# centerLLR[3]: 3 columns that define the structure center in lat,lon,radius * # centerLLR[3]: 3 columns that define the structure center in lat,lon,radius
# coloringValue[4]: 4 columns that define the ellipse &#8220;standard&#8221; colorings. The * # coloringValue[4]: 4 columns that define the ellipse &#8220;standard&#8221; colorings. The
# colorings are: slope (NA), elevation (NA), acceleration (NA), potential (NA) * # colorings are: slope (NA), elevation (NA), acceleration (NA), potential (NA)
# diameter: Diameter of (semimajor) axis of ellipse * # diameter: Diameter of (semimajor) axis of ellipse
# flattening: Flattening factor of ellipse. Range: [0.0, 1.0] * # flattening: Flattening factor of ellipse. Range: [0.0, 1.0]
# regularAngle: Angle between the semimajor axis and the line of longitude * # regularAngle: Angle between the semimajor axis and the line of longitude
# as projected onto the surface * # as projected onto the surface
# colorRGB: 1 column (of RGB values [0, 255] separated by commas with no * # colorRGB: 1 column (of RGB values [0, 255] separated by commas with no
# spaces). This column appears as a single textual column. * # spaces). This column appears as a single textual column.
# label: Label of the structure * # label: Label of the structure
# * #
# * #
# Please note the following: * # Please note the following:
# - Each line is composed of columns separated by a tab character. * # - Each line is composed of columns separated by a tab character.
# - Blank lines or lines that start with '#' are ignored. * # - Blank lines or lines that start with '#' are ignored.
# - Angle units: degrees * # - Angle units: degrees
# - Length units: kilometers * # - Length units: kilometers
* </pre> * </pre>
* *
* @author Hari.Nair@jhuapl.edu * @author Hari.Nair@jhuapl.edu
@@ -117,8 +117,8 @@ public abstract class SBMTStructure {
String[] parts = line.split("\\s+"); String[] parts = line.split("\\s+");
int id = Integer.parseInt(parts[0]); int id = Integer.parseInt(parts[0]);
String name = parts[1]; String name = parts[1];
Vector3D centerXYZ = new Vector3D(Double.parseDouble(parts[2]), Double.parseDouble(parts[3]), Vector3D centerXYZ =
Double.parseDouble(parts[4])); new Vector3D(Double.parseDouble(parts[2]), Double.parseDouble(parts[3]), Double.parseDouble(parts[4]));
String slopeColoring = parts[8]; String slopeColoring = parts[8];
String elevationColoring = parts[9]; String elevationColoring = parts[9];
String accelerationColoring = parts[10]; String accelerationColoring = parts[10];
@@ -127,8 +127,8 @@ public abstract class SBMTStructure {
double flattening = Double.parseDouble(parts[13]); double flattening = Double.parseDouble(parts[13]);
double regularAngle = Double.parseDouble(parts[14]); double regularAngle = Double.parseDouble(parts[14]);
String[] colorParts = parts[15].split(","); String[] colorParts = parts[15].split(",");
Color rgb = new Color(Integer.parseInt(colorParts[0]), Integer.parseInt(colorParts[1]), Color rgb = new Color(
Integer.parseInt(colorParts[2])); Integer.parseInt(colorParts[0]), Integer.parseInt(colorParts[1]), Integer.parseInt(colorParts[2]));
String label = parts[16]; String label = parts[16];
Builder builder = ImmutableSBMTStructure.builder(); Builder builder = ImmutableSBMTStructure.builder();
@@ -146,6 +146,4 @@ public abstract class SBMTStructure {
builder.label(label); builder.label(label);
return builder.build(); return builder.build();
} }
} }
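For reference, a single point structure following the 17-column layout described above would look like the hypothetical line below (shown with spaces for readability; the actual file is tab-separated). The four coloring columns are NA, and the color column packs RGB values with commas and no spaces::

    0  crater_01  0.123 -0.045 0.210  12.5 -20.1 0.246  NA NA NA NA  0.050  0.0  0.0  255,0,0  crater_01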

View File

@@ -39,7 +39,7 @@ import vtk.vtkPolyData;
*/ */
public class SmallBodyCubes { public class SmallBodyCubes {
private final static Logger logger = LogManager.getLogger(SmallBodyCubes.class); private static final Logger logger = LogManager.getLogger(SmallBodyCubes.class);
private BoundingBox boundingBox; private BoundingBox boundingBox;
private ArrayList<BoundingBox> allCubes = new ArrayList<BoundingBox>(); private ArrayList<BoundingBox> allCubes = new ArrayList<BoundingBox>();
@@ -55,21 +55,20 @@ public class SmallBodyCubes {
* @param cubeSize * @param cubeSize
* @param buffer * @param buffer
*/ */
public SmallBodyCubes(vtkPolyData smallBodyPolyData, double cubeSize, double buffer, public SmallBodyCubes(vtkPolyData smallBodyPolyData, double cubeSize, double buffer, boolean removeEmptyCubes) {
boolean removeEmptyCubes) {
this.cubeSize = cubeSize; this.cubeSize = cubeSize;
this.buffer = buffer; this.buffer = buffer;
initialize(smallBodyPolyData); initialize(smallBodyPolyData);
if (removeEmptyCubes) if (removeEmptyCubes) removeEmptyCubes(smallBodyPolyData);
removeEmptyCubes(smallBodyPolyData);
} }
private void initialize(vtkPolyData smallBodyPolyData) { private void initialize(vtkPolyData smallBodyPolyData) {
smallBodyPolyData.ComputeBounds(); smallBodyPolyData.ComputeBounds();
double[] bounds = smallBodyPolyData.GetBounds(); double[] bounds = smallBodyPolyData.GetBounds();
boundingBox = new BoundingBox(new Interval(bounds[0] - buffer, bounds[1] + buffer), boundingBox = new BoundingBox(
new Interval(bounds[0] - buffer, bounds[1] + buffer),
new Interval(bounds[2] - buffer, bounds[3] + buffer), new Interval(bounds[2] - buffer, bounds[3] + buffer),
new Interval(bounds[4] - buffer, bounds[5] + buffer)); new Interval(bounds[4] - buffer, bounds[5] + buffer));
@@ -86,8 +85,8 @@ public class SmallBodyCubes {
for (int i = 0; i < numCubesX; ++i) { for (int i = 0; i < numCubesX; ++i) {
double xmin = boundingBox.getXRange().getBegin() + i * cubeSize; double xmin = boundingBox.getXRange().getBegin() + i * cubeSize;
double xmax = boundingBox.getXRange().getBegin() + (i + 1) * cubeSize; double xmax = boundingBox.getXRange().getBegin() + (i + 1) * cubeSize;
BoundingBox bb = new BoundingBox(new Interval(xmin, xmax), new Interval(ymin, ymax), BoundingBox bb = new BoundingBox(
new Interval(zmin, zmax)); new Interval(xmin, xmax), new Interval(ymin, ymax), new Interval(zmin, zmax));
allCubes.add(bb); allCubes.add(bb);
} }
} }
@@ -188,14 +187,12 @@ public class SmallBodyCubes {
* @return * @return
*/ */
public int getCubeId(double[] point) { public int getCubeId(double[] point) {
if (!boundingBox.contains(point)) if (!boundingBox.contains(point)) return -1;
return -1;
int numberCubes = allCubes.size(); int numberCubes = allCubes.size();
for (int i = 0; i < numberCubes; ++i) { for (int i = 0; i < numberCubes; ++i) {
BoundingBox cube = getCube(i); BoundingBox cube = getCube(i);
if (cube.contains(point)) if (cube.contains(point)) return i;
return i;
} }
// If we reach here something is wrong // If we reach here something is wrong
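getCubeId() above locates the containing cube by scanning the cube list. For the full regular grid (before any empty cubes are removed), an alternative direct computation is possible; a sketch with hypothetical parameter names, assuming cubes were added with x varying fastest, then y, then z::

    public class CubeIndex {
        // boxMin is the padded bounding-box corner, cubeSize the grid spacing
        static int cubeId(double[] pt, double[] boxMin, double cubeSize, int numCubesX, int numCubesY) {
            int i = (int) Math.floor((pt[0] - boxMin[0]) / cubeSize);
            int j = (int) Math.floor((pt[1] - boxMin[1]) / cubeSize);
            int k = (int) Math.floor((pt[2] - boxMin[2]) / cubeSize);
            return (k * numCubesY + j) * numCubesX + i;
        }

        public static void main(String[] args) {
            double[] boxMin = {-1.0, -1.0, -1.0};
            System.out.println(cubeId(new double[] {0.3, -0.2, 0.7}, boxMin, 0.5, 4, 4));
        }
    }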

View File

@@ -28,7 +28,6 @@ import java.util.ArrayList;
import java.util.HashSet; import java.util.HashSet;
import java.util.Set; import java.util.Set;
import java.util.TreeSet; import java.util.TreeSet;
import org.apache.commons.math3.geometry.euclidean.threed.Rotation; import org.apache.commons.math3.geometry.euclidean.threed.Rotation;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D; import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.commons.math3.stat.descriptive.DescriptiveStatistics; import org.apache.commons.math3.stat.descriptive.DescriptiveStatistics;
@@ -65,10 +64,11 @@ import vtk.vtksbModifiedBSPTree;
*/ */
public class SmallBodyModel { public class SmallBodyModel {
private final static Logger logger = LogManager.getLogger(SmallBodyModel.class); private static final Logger logger = LogManager.getLogger(SmallBodyModel.class);
public enum ColoringValueType { public enum ColoringValueType {
POINT_DATA, CELLDATA POINT_DATA,
CELLDATA
} }
private vtkPolyData smallBodyPolyData; private vtkPolyData smallBodyPolyData;
@@ -114,8 +114,12 @@ public class SmallBodyModel {
setSmallBodyPolyData(polyData, coloringValues, coloringNames, coloringUnits, coloringValueType); setSmallBodyPolyData(polyData, coloringValues, coloringNames, coloringUnits, coloringValueType);
} }
public void setSmallBodyPolyData(vtkPolyData polydata, vtkFloatArray[] coloringValues, public void setSmallBodyPolyData(
String[] coloringNames, String[] coloringUnits, ColoringValueType coloringValueType) { vtkPolyData polydata,
vtkFloatArray[] coloringValues,
String[] coloringNames,
String[] coloringUnits,
ColoringValueType coloringValueType) {
smallBodyPolyData.DeepCopy(polydata); smallBodyPolyData.DeepCopy(polydata);
smallBodyPolyData.BuildLinks(0); smallBodyPolyData.BuildLinks(0);
@@ -201,11 +205,9 @@ public class SmallBodyModel {
// comes out to 1 km for Eros. // comes out to 1 km for Eros.
// Compute bounding box diagonal length of lowest res shape model // Compute bounding box diagonal length of lowest res shape model
double diagonalLength = double diagonalLength = new BoundingBox(getLowResSmallBodyPolyData().GetBounds()).getDiagonalLength();
new BoundingBox(getLowResSmallBodyPolyData().GetBounds()).getDiagonalLength();
double cubeSize = diagonalLength / 38.66056033363347; double cubeSize = diagonalLength / 38.66056033363347;
smallBodyCubes = smallBodyCubes = new SmallBodyCubes(getLowResSmallBodyPolyData(), cubeSize, 0.01 * cubeSize, true);
new SmallBodyCubes(getLowResSmallBodyPolyData(), cubeSize, 0.01 * cubeSize, true);
} }
return smallBodyCubes; return smallBodyCubes;
@@ -274,8 +276,7 @@ public class SmallBodyModel {
* @return * @return
*/ */
public double[] getNormalAtPoint(double[] point, double radius) { public double[] getNormalAtPoint(double[] point, double radius) {
return PolyDataUtil.getPolyDataNormalAtPointWithinRadius(point, smallBodyPolyData, pointLocator, return PolyDataUtil.getPolyDataNormalAtPointWithinRadius(point, smallBodyPolyData, pointLocator, radius);
radius);
} }
public double[] getClosestNormal(double[] point) { public double[] getClosestNormal(double[] point) {
@@ -402,8 +403,7 @@ public class SmallBodyModel {
double[] origin = {0.0, 0.0, 0.0}; double[] origin = {0.0, 0.0, 0.0};
double[] lookPt = {rect.getI(), rect.getJ(), rect.getK()}; double[] lookPt = {rect.getI(), rect.getJ(), rect.getK()};
return computeRayIntersection(origin, lookPt, 10 * getBoundingBoxDiagonalLength(), return computeRayIntersection(origin, lookPt, 10 * getBoundingBoxDiagonalLength(), intersectPoint);
intersectPoint);
} }
/** /**
@@ -428,11 +428,9 @@ public class SmallBodyModel {
data[0][m][n] = lat; data[0][m][n] = lat;
data[1][m][n] = lon; data[1][m][n] = lon;
long cellId = long cellId = getPointAndCellIdFromLatLon(Math.toRadians(lat), Math.toRadians(lon), intersectPoint);
getPointAndCellIdFromLatLon(Math.toRadians(lat), Math.toRadians(lon), intersectPoint);
double rad = -1.0e32; double rad = -1.0e32;
if (cellId >= 0) if (cellId >= 0) rad = new VectorIJK(intersectPoint).getLength();
rad = new VectorIJK(intersectPoint).getLength();
else { else {
logger.info(String.format("Warning: no intersection at lat:%.5f, lon:%.5f", lat, lon)); logger.info(String.format("Warning: no intersection at lat:%.5f, lon:%.5f", lat, lon));
} }
@@ -472,8 +470,7 @@ public class SmallBodyModel {
* @param intersectPoint (returned) * @param intersectPoint (returned)
* @return the cellId of the cell containing the intersect point or -1 if no intersection * @return the cellId of the cell containing the intersect point or -1 if no intersection
*/ */
public long computeRayIntersection(double[] origin, double[] direction, double distance, public long computeRayIntersection(double[] origin, double[] direction, double distance, double[] intersectPoint) {
double[] intersectPoint) {
double[] lookPt = new double[3]; double[] lookPt = new double[3];
lookPt[0] = origin[0] + 2.0 * distance * direction[0]; lookPt[0] = origin[0] + 2.0 * distance * direction[0];
@@ -487,17 +484,14 @@ public class SmallBodyModel {
int[] subId = new int[1]; int[] subId = new int[1];
long[] cellId = new long[1]; long[] cellId = new long[1];
int result = cellLocator.IntersectWithLine(origin, lookPt, tol, t, x, pcoords, subId, cellId, int result = cellLocator.IntersectWithLine(origin, lookPt, tol, t, x, pcoords, subId, cellId, genericCell);
genericCell);
intersectPoint[0] = x[0]; intersectPoint[0] = x[0];
intersectPoint[1] = x[1]; intersectPoint[1] = x[1];
intersectPoint[2] = x[2]; intersectPoint[2] = x[2];
if (result > 0) if (result > 0) return cellId[0];
return cellId[0]; else return -1;
else
return -1;
} }
/** /**
@@ -510,8 +504,7 @@ public class SmallBodyModel {
public Vector3D findEastVector(double[] pt) { public Vector3D findEastVector(double[] pt) {
// define a topographic frame where the Z axis points up and the Y axis points north. The X axis // define a topographic frame where the Z axis points up and the Y axis points north. The X axis
// will point east. // will point east.
Rotation bodyFixedToTopo = Rotation bodyFixedToTopo = RotationUtils.KprimaryJsecondary(new Vector3D(pt), Vector3D.PLUS_K);
RotationUtils.KprimaryJsecondary(new Vector3D(pt), Vector3D.PLUS_K);
return bodyFixedToTopo.applyTo(Vector3D.PLUS_I); return bodyFixedToTopo.applyTo(Vector3D.PLUS_I);
} }
@@ -525,8 +518,7 @@ public class SmallBodyModel {
public Vector3D findWestVector(double[] pt) { public Vector3D findWestVector(double[] pt) {
// define a topographic frame where the Z axis points up and the Y axis points north. The X axis // define a topographic frame where the Z axis points up and the Y axis points north. The X axis
// will point east. // will point east.
Rotation bodyFixedToTopo = Rotation bodyFixedToTopo = RotationUtils.KprimaryJsecondary(new Vector3D(pt), Vector3D.PLUS_K);
RotationUtils.KprimaryJsecondary(new Vector3D(pt), Vector3D.PLUS_K);
return bodyFixedToTopo.applyTo(Vector3D.MINUS_I); return bodyFixedToTopo.applyTo(Vector3D.MINUS_I);
} }
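Both directions can also be written directly from the spin axis: east at a surface point is perpendicular to the body-fixed +Z axis and to the radial direction, and west is its negation. A minimal sketch assuming Apache Commons Math 3 (undefined at the poles, where the cross product vanishes)::

    import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;

    public class EastWest {
        // east = normalize(+Z x r); away from the poles this matches the
        // topographic-frame construction used above
        static Vector3D east(Vector3D pt) {
            return Vector3D.PLUS_K.crossProduct(pt).normalize();
        }

        static Vector3D west(Vector3D pt) {
            return east(pt).negate();
        }

        public static void main(String[] args) {
            Vector3D pt = new Vector3D(0.1, 0.2, 0.05); // hypothetical surface point
            System.out.println(east(pt) + " " + west(pt));
        }
    }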
@@ -557,12 +549,14 @@ public class SmallBodyModel {
double[] pt1 = points.GetPoint(1); double[] pt1 = points.GetPoint(1);
double[] pt2 = points.GetPoint(2); double[] pt2 = points.GetPoint(2);
TriangularFacet facet = TriangularFacet facet = new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
stats.addValue(VectorIJK.subtract(facet.getVertex1(), facet.getVertex2()).getLength()); stats.addValue(
stats.addValue(VectorIJK.subtract(facet.getVertex2(), facet.getVertex3()).getLength()); VectorIJK.subtract(facet.getVertex1(), facet.getVertex2()).getLength());
stats.addValue(VectorIJK.subtract(facet.getVertex3(), facet.getVertex1()).getLength()); stats.addValue(
VectorIJK.subtract(facet.getVertex2(), facet.getVertex3()).getLength());
stats.addValue(
VectorIJK.subtract(facet.getVertex3(), facet.getVertex1()).getLength());
points.Delete(); points.Delete();
cell.Delete(); cell.Delete();
@@ -572,24 +566,17 @@ public class SmallBodyModel {
} }
public String getModelName() { public String getModelName() {
if (resolutionLevel >= 0 && resolutionLevel < modelNames.length) if (resolutionLevel >= 0 && resolutionLevel < modelNames.length) return modelNames[resolutionLevel];
return modelNames[resolutionLevel]; else return null;
else
return null;
} }
/** clean up VTK allocated internal objects */ /** clean up VTK allocated internal objects */
public void delete() { public void delete() {
if (cellLocator != null) if (cellLocator != null) cellLocator.Delete();
cellLocator.Delete(); if (bspLocator != null) bspLocator.Delete();
if (bspLocator != null) if (pointLocator != null) pointLocator.Delete();
bspLocator.Delete(); if (genericCell != null) genericCell.Delete();
if (pointLocator != null) if (smallBodyPolyData != null) smallBodyPolyData.Delete();
pointLocator.Delete();
if (genericCell != null)
genericCell.Delete();
if (smallBodyPolyData != null)
smallBodyPolyData.Delete();
} }
public void saveAsPLT(File file) throws IOException { public void saveAsPLT(File file) throws IOException {
@@ -607,5 +594,4 @@ public class SmallBodyModel {
public void saveAsSTL(File file) throws IOException { public void saveAsSTL(File file) throws IOException {
PolyDataUtil.saveShapeModelAsSTL(smallBodyPolyData, file.getAbsolutePath()); PolyDataUtil.saveShapeModelAsSTL(smallBodyPolyData, file.getAbsolutePath());
} }
} }

View File

@@ -58,8 +58,7 @@ public class DefaultTerrasaurTool implements TerrasaurTool {
private static Options defineOptions() { private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions(); Options options = TerrasaurTool.defineOptions();
options.addOption( options.addOption(Option.builder("env")
Option.builder("env")
.hasArgs() .hasArgs()
.required() .required()
.desc("Print the named environment variable's value. Can take multiple arguments.") .desc("Print the named environment variable's value. Can take multiple arguments.")
@@ -78,8 +77,7 @@ public class DefaultTerrasaurTool implements TerrasaurTool {
for (MessageLabel ml : startupMessages.keySet()) for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml))); logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
for (String env : cl.getOptionValues("env")) for (String env : cl.getOptionValues("env")) logger.info(String.format("%s: %s", env, System.getenv(env)));
logger.info(String.format("%s: %s", env, System.getenv(env)));
logger.info("Finished"); logger.info("Finished");
} }

View File

@@ -22,17 +22,16 @@
*/ */
package terrasaur.templates; package terrasaur.templates;
import org.apache.commons.cli.*;
import org.apache.logging.log4j.Level;
import terrasaur.utils.AppVersion;
import terrasaur.utils.Log4j2Configurator;
import java.io.PrintWriter; import java.io.PrintWriter;
import java.io.StringWriter; import java.io.StringWriter;
import java.net.InetAddress; import java.net.InetAddress;
import java.net.UnknownHostException; import java.net.UnknownHostException;
import java.util.LinkedHashMap; import java.util.LinkedHashMap;
import java.util.Map; import java.util.Map;
import org.apache.commons.cli.*;
import org.apache.logging.log4j.Level;
import terrasaur.utils.AppVersion;
import terrasaur.utils.Log4j2Configurator;
/** /**
* All classes in the apps folder should implement this interface. Calling the class without * All classes in the apps folder should implement this interface. Calling the class without
@@ -46,8 +45,7 @@ public interface TerrasaurTool {
/** Show required options first, followed by non-required. */ /** Show required options first, followed by non-required. */
class CustomHelpFormatter extends HelpFormatter { class CustomHelpFormatter extends HelpFormatter {
public CustomHelpFormatter() { public CustomHelpFormatter() {
setOptionComparator( setOptionComparator((o1, o2) -> {
(o1, o2) -> {
if (o1.isRequired() && !o2.isRequired()) return -1; if (o1.isRequired() && !o2.isRequired()) return -1;
if (!o1.isRequired() && o2.isRequired()) return 1; if (!o1.isRequired() && o2.isRequired()) return 1;
return o1.getKey().compareToIgnoreCase(o2.getKey()); return o1.getKey().compareToIgnoreCase(o2.getKey());
@@ -132,18 +130,15 @@ public interface TerrasaurTool {
*/ */
static Options defineOptions() { static Options defineOptions() {
Options options = new Options(); Options options = new Options();
options.addOption( options.addOption(Option.builder("logFile")
Option.builder("logFile")
.hasArg() .hasArg()
.desc("If present, save screen output to log file.") .desc("If present, save screen output to log file.")
.build()); .build());
StringBuilder sb = new StringBuilder(); StringBuilder sb = new StringBuilder();
for (Level l : Level.values()) sb.append(String.format("%s ", l.name())); for (Level l : Level.values()) sb.append(String.format("%s ", l.name()));
options.addOption( options.addOption(Option.builder("logLevel")
Option.builder("logLevel")
.hasArg() .hasArg()
.desc( .desc("If present, print messages above selected priority. Valid values are "
"If present, print messages above selected priority. Valid values are "
+ sb.toString().trim() + sb.toString().trim()
+ ". Default is INFO.") + ". Default is INFO.")
.build()); .build());
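A minimal sketch of consuming options built this way with stock commons-cli; the option name mirrors the snippet above, while the wrapper class and arguments are hypothetical::

    import org.apache.commons.cli.*;

    public class ParseSketch {
        public static void main(String[] args) throws ParseException {
            Options options = new Options();
            options.addOption(Option.builder("logFile")
                    .hasArg()
                    .desc("If present, save screen output to log file.")
                    .build());
            CommandLine cl = new DefaultParser().parse(options, args);
            if (cl.hasOption("logFile"))
                System.out.println("logging to " + cl.getOptionValue("logFile"));
        }
    }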
@@ -198,15 +193,13 @@ public interface TerrasaurTool {
Log4j2Configurator lc = Log4j2Configurator.getInstance(); Log4j2Configurator lc = Log4j2Configurator.getInstance();
if (cl.hasOption("logLevel")) if (cl.hasOption("logLevel"))
lc.setLevel(Level.valueOf(cl.getOptionValue("logLevel").toUpperCase().trim())); lc.setLevel(
Level.valueOf(cl.getOptionValue("logLevel").toUpperCase().trim()));
if (cl.hasOption("logFile")) lc.addFile(cl.getOptionValue("logFile")); if (cl.hasOption("logFile")) lc.addFile(cl.getOptionValue("logFile"));
StringBuilder sb = StringBuilder sb = new StringBuilder(
new StringBuilder( String.format("%s [%s] on %s", getClass().getSimpleName(), AppVersion.getVersionString(), hostname));
String.format(
"%s [%s] on %s",
getClass().getSimpleName(), AppVersion.getVersionString(), hostname));
startupMessages.put(MessageLabel.START, sb.toString()); startupMessages.put(MessageLabel.START, sb.toString());
sb = new StringBuilder(); sb = new StringBuilder();
@@ -226,5 +219,4 @@ public interface TerrasaurTool {
return startupMessages; return startupMessages;
} }
} }

View File

@@ -24,26 +24,25 @@
package terrasaur.utils; package terrasaur.utils;
public class AppVersion { public class AppVersion {
public final static String lastCommit = "25.04.27"; public static final String lastCommit = "25.07.30";
// an M at the end of gitRevision means this was built from a "dirty" git repository // an M at the end of gitRevision means this was built from a "dirty" git repository
public final static String gitRevision = "cb0f7f8"; public static final String gitRevision = "6212144";
public final static String applicationName = "Terrasaur"; public static final String applicationName = "Terrasaur";
public final static String dateString = "2025-Apr-28 15:06:13 UTC"; public static final String dateString = "2025-Jul-30 16:05:45 UTC";
private AppVersion() {} private AppVersion() {}
/** /**
* Terrasaur version 25.04.27-cb0f7f8 built 2025-Apr-28 15:06:13 UTC * Terrasaur version 25.07.30-6212144 built 2025-Jul-30 16:05:45 UTC
*/ */
public static String getFullString() { public static String getFullString() {
return String.format("%s version %s-%s built %s", applicationName, lastCommit, gitRevision, dateString); return String.format("%s version %s-%s built %s", applicationName, lastCommit, gitRevision, dateString);
} }
/** /**
* Terrasaur version 25.04.27-cb0f7f8 * Terrasaur version 25.07.30-6212144
*/ */
public static String getVersionString() { public static String getVersionString() {
return String.format("%s version %s-%s", applicationName, lastCommit, gitRevision); return String.format("%s version %s-%s", applicationName, lastCommit, gitRevision);
} }
} }

View File

@@ -86,7 +86,9 @@ public class Binary16 {
if ((fbits & 0x7fffffff) >= 0x47800000) { // is or must become NaN/Inf if ((fbits & 0x7fffffff) >= 0x47800000) { // is or must become NaN/Inf
if (val < 0x7f800000) // was value but too large if (val < 0x7f800000) // was value but too large
return sign | 0x7c00; // make it +/-Inf return sign | 0x7c00; // make it +/-Inf
return sign | 0x7c00 | // remains +/-Inf or NaN return sign
| 0x7c00
| // remains +/-Inf or NaN
(fbits & 0x007fffff) >>> 13; // keep NaN (and Inf) bits (fbits & 0x007fffff) >>> 13; // keep NaN (and Inf) bits
} }
return sign | 0x7bff; // unrounded not quite Inf return sign | 0x7bff; // unrounded not quite Inf
@@ -96,9 +98,9 @@ public class Binary16 {
if (val < 0x33000000) // too small for subnormal if (val < 0x33000000) // too small for subnormal
return sign; // becomes +/-0 return sign; // becomes +/-0
val = (fbits & 0x7fffffff) >>> 23; // tmp exp for subnormal calc val = (fbits & 0x7fffffff) >>> 23; // tmp exp for subnormal calc
return sign | ((fbits & 0x7fffff | 0x800000) // add subnormal bit return sign
| ((fbits & 0x7fffff | 0x800000) // add subnormal bit
+ (0x800000 >>> val - 102) // round depending on cut off + (0x800000 >>> val - 102) // round depending on cut off
>>> 126 - val); // div by 2^(1-(exp-127+15)) and >> 13 | exp=0 >>> 126 - val); // div by 2^(1-(exp-127+15)) and >> 13 | exp=0
} }
} }
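On JDK 20 and later the standard library offers an equivalent float-to-half conversion (rounding details may differ slightly from the hand-rolled version above); a minimal check with a hypothetical sample value::

    public class HalfFloatCheck {
        public static void main(String[] args) {
            float value = 0.1f;                          // hypothetical sample
            short half = Float.floatToFloat16(value);    // JDK 20+ built-in conversion
            float back = Float.float16ToFloat(half);
            System.out.printf("0x%04x -> %.6f%n", half & 0xFFFF, back);
        }
    }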

View File

@@ -34,34 +34,33 @@ import java.io.IOException;
*/ */
public class BinaryUtils { public class BinaryUtils {
static public float readFloatAndSwap(DataInputStream is) throws IOException { public static float readFloatAndSwap(DataInputStream is) throws IOException {
int intValue = is.readInt(); int intValue = is.readInt();
intValue = ByteSwapper.swap(intValue); intValue = ByteSwapper.swap(intValue);
return Float.intBitsToFloat(intValue); return Float.intBitsToFloat(intValue);
} }
static public double readDoubleAndSwap(DataInputStream is) throws IOException { public static double readDoubleAndSwap(DataInputStream is) throws IOException {
long longValue = is.readLong(); long longValue = is.readLong();
longValue = ByteSwapper.swap(longValue); longValue = ByteSwapper.swap(longValue);
return Double.longBitsToDouble(longValue); return Double.longBitsToDouble(longValue);
} }
static public void writeFloatAndSwap(DataOutputStream os, float value) throws IOException { public static void writeFloatAndSwap(DataOutputStream os, float value) throws IOException {
int intValue = Float.floatToRawIntBits(value); int intValue = Float.floatToRawIntBits(value);
intValue = ByteSwapper.swap(intValue); intValue = ByteSwapper.swap(intValue);
os.writeInt(intValue); os.writeInt(intValue);
} }
static public void writeDoubleAndSwap(DataOutputStream os, double value) throws IOException { public static void writeDoubleAndSwap(DataOutputStream os, double value) throws IOException {
long longValue = Double.doubleToRawLongBits(value); long longValue = Double.doubleToRawLongBits(value);
longValue = ByteSwapper.swap(longValue); longValue = ByteSwapper.swap(longValue);
os.writeLong(longValue); os.writeLong(longValue);
} }
// This function is taken from // This function is taken from
// http://www.java2s.com/Code/Java/Language-Basics/Utilityforbyteswappingofalljavadatatypes.htm // http://www.java2s.com/Code/Java/Language-Basics/Utilityforbyteswappingofalljavadatatypes.htm
static public short swap(short value) { public static short swap(short value) {
int b1 = value & 0xff; int b1 = value & 0xff;
int b2 = (value >> 8) & 0xff; int b2 = (value >> 8) & 0xff;
@@ -70,7 +69,7 @@ public class BinaryUtils {
// This function is taken from // This function is taken from
// http://www.java2s.com/Code/Java/Language-Basics/Utilityforbyteswappingofalljavadatatypes.htm // http://www.java2s.com/Code/Java/Language-Basics/Utilityforbyteswappingofalljavadatatypes.htm
static public int swap(int value) { public static int swap(int value) {
int b1 = (value >> 0) & 0xff; int b1 = (value >> 0) & 0xff;
int b2 = (value >> 8) & 0xff; int b2 = (value >> 8) & 0xff;
int b3 = (value >> 16) & 0xff; int b3 = (value >> 16) & 0xff;
@@ -108,5 +107,4 @@ public class BinaryUtils {
intValue = swap(intValue); intValue = swap(intValue);
return Float.intBitsToFloat(intValue); return Float.intBitsToFloat(intValue);
} }
} }

View File

@@ -40,8 +40,6 @@ package terrasaur.utils;
 // package no.geosoft.cc.util;
 /**
  * Utility class for doing byte swapping (i.e. conversion between little-endian and big-endian
  * representations) of different data types. Byte swapping is typically used when data is read from
@@ -63,8 +61,6 @@ public class ByteSwapper {
     return (short) (b1 << 8 | b2 << 0);
   }
   /**
    * Byte swap a single int value.
    *
@@ -80,8 +76,6 @@ public class ByteSwapper {
     return b1 << 24 | b2 << 16 | b3 << 8 | b4 << 0;
   }
   /**
    * Byte swap a single long value.
    *
@@ -101,8 +95,6 @@ public class ByteSwapper {
     return b1 << 56 | b2 << 48 | b3 << 40 | b4 << 32 | b5 << 24 | b6 << 16 | b7 << 8 | b8 << 0;
   }
   /**
    * Byte swap a single float value.
    *
@@ -115,8 +107,6 @@ public class ByteSwapper {
     return Float.intBitsToFloat(intValue);
   }
   /**
    * Byte swap a single double value.
    *
@@ -129,64 +119,48 @@ public class ByteSwapper {
     return Double.longBitsToDouble(longValue);
   }
   /**
    * Byte swap an array of shorts. The result of the swapping is put back into the specified array.
    *
    * @param array Array of values to swap
    */
   public static void swap(short[] array) {
-    for (int i = 0; i < array.length; i++)
-      array[i] = swap(array[i]);
+    for (int i = 0; i < array.length; i++) array[i] = swap(array[i]);
   }
   /**
    * Byte swap an array of ints. The result of the swapping is put back into the specified array.
    *
    * @param array Array of values to swap
    */
   public static void swap(int[] array) {
-    for (int i = 0; i < array.length; i++)
-      array[i] = swap(array[i]);
+    for (int i = 0; i < array.length; i++) array[i] = swap(array[i]);
   }
   /**
    * Byte swap an array of longs. The result of the swapping is put back into the specified array.
    *
    * @param array Array of values to swap
    */
   public static void swap(long[] array) {
-    for (int i = 0; i < array.length; i++)
-      array[i] = swap(array[i]);
+    for (int i = 0; i < array.length; i++) array[i] = swap(array[i]);
   }
   /**
    * Byte swap an array of floats. The result of the swapping is put back into the specified array.
    *
    * @param array Array of values to swap
    */
   public static void swap(float[] array) {
-    for (int i = 0; i < array.length; i++)
-      array[i] = swap(array[i]);
+    for (int i = 0; i < array.length; i++) array[i] = swap(array[i]);
   }
   /**
    * Byte swap an array of doubles. The result of the swapping is put back into the specified array.
    *
    * @param array Array of values to swap
    */
   public static void swap(double[] array) {
-    for (int i = 0; i < array.length; i++)
-      array[i] = swap(array[i]);
+    for (int i = 0; i < array.length; i++) array[i] = swap(array[i]);
   }
 }
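
A short usage sketch of ByteSwapper (the terrasaur.utils package comes from the hunk header above); the expected values in the comments follow directly from the bit arithmetic shown in the diff:

import terrasaur.utils.ByteSwapper;

public class ByteSwapperDemo {
  public static void main(String[] args) {
    // single values: the bytes are reversed end to end
    System.out.printf("0x%04x%n", ByteSwapper.swap((short) 0x1234) & 0xFFFF); // 0x3412
    System.out.printf("0x%08x%n", ByteSwapper.swap(0x12345678));              // 0x78563412

    // array overloads swap every element in place
    int[] values = {0x00000001, 0x0000ff00};
    ByteSwapper.swap(values);
    System.out.printf("0x%08x 0x%08x%n", values[0], values[1]);               // 0x01000000 0x00ff0000
  }
}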

Some files were not shown because too many files have changed in this diff.