Merge pull request #6 from JHUAPL/5-update-to-v110

merge 5 update to v110
Hari Nair, 2025-07-30 12:13:27 -04:00 (committed via GitHub)
226 changed files with 36729 additions and 36147 deletions


@@ -1,5 +1,20 @@
# Terrasaur Changelog
## July 30, 2025 - v1.1.0
- Updates to existing tools
- AdjustShapeModelToOtherShapeModel
- fix intersection bug
- CreateSBMTStructure
- new options: -flipX, -flipY, -spice, -date, -observer, -target, -cameraFrame
- ValidateNormals
- new option: -fast to only check for overhangs if center and normal point in opposite directions
- New tools:
- FacetInfo: Print info about a facet
- PointCloudOverlap: Find points in a point cloud which overlap a reference point cloud
- TriAx: Generate a triaxial ellipsoid in ICQ format
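The new ValidateNormals `-fast` option checks for overhangs only when a facet's center and normal point in opposite directions. A minimal sketch of that dot-product test, assuming facet centers and outward normals given as 3-vectors (the class and method names here are illustrative, not the tool's actual code):

```java
// Hypothetical helper illustrating the -fast overhang pre-check:
// a facet is a candidate overhang when its center vector and its
// normal point in opposite directions, i.e. their dot product is negative.
public class OverhangCheck {
    static double dot(double[] a, double[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    // Returns true when center and normal point in opposite directions.
    public static boolean isPossibleOverhang(double[] center, double[] normal) {
        return dot(center, normal) < 0;
    }
}
```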
## April 28, 2025 - v1.0.1
- Add MIT license to repository and source code



@@ -16,7 +16,7 @@ The Terrasaur package requires Java 21 or later. Some freely available versions
Download
~~~~~~~~
Binary packages for use on Mac OS X and Linux are available at ...
Binary packages for use on Mac OS X and Linux are available at `GitHub <https://github.com/JHUAPL/Terrasaur/releases>`__.
We have not tried using the software on Microsoft Windows, but users may try the Linux package with the `Windows Subsystem for Linux <https://docs.microsoft.com/en-us/windows/wsl/>`__.

doc/tools/ColorSpots.rst (new file)

@@ -0,0 +1,70 @@
.. _ColorSpots:
##########
ColorSpots
##########
ColorSpots takes as input a shape model and a file containing (x, y, z, value),
or (lat, lon, value). It writes out the mean and standard deviation of values
within a specified range for each facet.
.. include:: ../toolDescriptions/ColorSpots.txt
:literal:
********
Examples
********
Download the :download:`Apophis<./support_files/apophis_g_15618mm_rdr_obj_0000n00000_v001.obj>`
shape model and the :download:`info<./support_files/xyzrandom.txt>` file containing
cartesian coordinates and a random value.
Run ColorSpots:
::
ColorSpots -obj apophis_g_15618mm_rdr_obj_0000n00000_v001.obj -xyz \
-info xyzrandom.txt -outFile apophis_value_at_vertex.csv -noWeight \
-allFacets -additionalFields n -searchRadius 0.015 -writeVertices
The first few lines of apophis_value_at_vertex.csv look like:
::
% head apophis_value_at_vertex.csv
0.000000e+00, 0.000000e+00, 1.664960e-01, -3.805764e-02, 5.342315e-01, 4.000000e+01
1.589500e-02, 0.000000e+00, 1.591030e-01, 6.122849e-02, 6.017192e-01, 5.000000e+01
7.837000e-03, 1.486800e-02, 1.591670e-01, -6.072964e-03, 5.220682e-01, 5.700000e+01
-7.747000e-03, 1.506300e-02, 1.621040e-01, 9.146163e-02, 5.488631e-01, 4.900000e+01
-1.554900e-02, 0.000000e+00, 1.657970e-01, -8.172811e-03, 5.270302e-01, 3.400000e+01
-7.982000e-03, -1.571100e-02, 1.694510e-01, -2.840524e-02, 5.045911e-01, 3.900000e+01
8.060000e-03, -1.543300e-02, 1.655150e-01, 3.531959e-02, 5.464390e-01, 4.900000e+01
3.179500e-02, 0.000000e+00, 1.515820e-01, -1.472434e-02, 5.967265e-01, 5.400000e+01
2.719700e-02, 1.658200e-02, 1.508930e-01, -9.050683e-03, 5.186966e-01, 4.700000e+01
1.554100e-02, 2.901300e-02, 1.530770e-01, -7.053547e-02, 4.980369e-01, 7.000000e+01
The columns are:
.. list-table:: ColorSpots Vertex output
:header-rows: 1
* - Column
- Value
* - 1
- X
* - 2
- Y
* - 3
- Z
* - 4
- mean value in region
* - 5
- standard deviation in region
* - 6
- number of points in region
.. figure:: images/ColorSpots-n.png
:alt: Number of points in region at each vertex
This image shows the number of points in the region at each vertex.
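The per-facet statistics described above can be sketched as follows. This is a hypothetical illustration of the computation, assuming points and values in plain arrays, not ColorSpots' actual implementation; the class and method names are invented for the example.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch: mean, standard deviation, and count of the values
// whose points fall within -searchRadius of a vertex.
public class SpotStats {
    // points[i] = {x, y, z}; values[i] = value at that point.
    public static double[] meanStdInRadius(double[] vertex, double[][] points,
                                           double[] values, double radius) {
        List<Double> inRange = new ArrayList<>();
        for (int i = 0; i < points.length; i++) {
            double dx = points[i][0] - vertex[0];
            double dy = points[i][1] - vertex[1];
            double dz = points[i][2] - vertex[2];
            if (Math.sqrt(dx * dx + dy * dy + dz * dz) <= radius) inRange.add(values[i]);
        }
        int n = inRange.size();
        double mean = 0;
        for (double v : inRange) mean += v;
        mean /= Math.max(n, 1);
        double var = 0;
        for (double v : inRange) var += (v - mean) * (v - mean);
        double std = n > 1 ? Math.sqrt(var / (n - 1)) : 0;
        // Matches the output columns: mean, standard deviation, number of points.
        return new double[] {mean, std, n};
    }
}
```

With `-searchRadius 0.015` as in the example above, only points within 15 mm of each vertex would contribute to that vertex's statistics.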


@@ -23,7 +23,7 @@ Local Model Comparison
Download the :download:`reference<./support_files/EVAL20_wtr.obj>` and :download:`comparison<./support_files/EVAL20.obj>`
shape models. You can view them in a tool such as
`ParaView<https://www.paraview.org/>`.
`ParaView <https://www.paraview.org/>`__.
.. figure:: images/CompareOBJ_local_1.png
@@ -32,8 +32,8 @@ shape models. You can view them in a tool such as
Run CompareOBJ to find the optimal transform to align the comparison with the reference:
::
CompareOBJ -computeOptimalRotationAndTranslation -model F3H-1/EVAL20.obj \
-reference F3H-1/EVAL20_wtr.obj -computeVerticalError verticalError.txt \
CompareOBJ -computeOptimalRotationAndTranslation -model EVAL20.obj \
-reference EVAL20_wtr.obj -computeVerticalError verticalError.txt \
-saveOptimalShape optimal.obj -savePlateDiff plateDiff.txt -savePlateIndex plateIndex.txt
The screen output is
@@ -77,7 +77,7 @@ model for comparison:
::
ShapeFormatConverter -input Bennu/Bennu49k.obj -output BennuComparison.obj \
ShapeFormatConverter -input Bennu49k.obj -output BennuComparison.obj \
-rotate 5,0,0,1 -translate 0.01,-0.01,0.01
This rotates the shape model by 5 degrees about the z axis and then translates
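The order of operations matters: rotation first, then translation. A minimal sketch of that per-vertex transform, with a hypothetical helper class (not ShapeFormatConverter's actual code), rotating about the z axis as `-rotate 5,0,0,1` does:

```java
// Illustrative rotate-then-translate applied to one vertex:
// rotate by angleDeg about the z axis, then add the translation offset t.
public class RotateTranslate {
    public static double[] apply(double[] v, double angleDeg, double[] t) {
        double a = Math.toRadians(angleDeg);
        double x = Math.cos(a) * v[0] - Math.sin(a) * v[1];
        double y = Math.sin(a) * v[0] + Math.cos(a) * v[1];
        return new double[] {x + t[0], y + t[1], v[2] + t[2]};
    }
}
```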
@@ -94,11 +94,11 @@ Run CompareOBJ to find the optimal transform to align the comparison with the re
CompareOBJ -computeOptimalRotationAndTranslation \
-model BennuComparison.obj \
-reference Bennu/Bennu49k.obj \
-computeVerticalError CompareOBJ/terrasaur-verticalError.txt \
-saveOptimalShape CompareOBJ/terrasaur-optimal.obj \
-savePlateDiff CompareOBJ/terrasaur-plateDiff.txt \
-savePlateIndex CompareOBJ/terrasaur-plateIndex.txt
-reference Bennu49k.obj \
-computeVerticalError terrasaur-verticalError.txt \
-saveOptimalShape terrasaur-optimal.obj \
-savePlateDiff terrasaur-plateDiff.txt \
-savePlateIndex terrasaur-plateIndex.txt
The screen output is


@@ -0,0 +1,38 @@
.. _PointCloudOverlap:
#################
PointCloudOverlap
#################
*****
Usage
*****
.. include:: ../toolDescriptions/PointCloudOverlap.txt
:literal:
********
Examples
********
Download the :download:`reference<./support_files/EVAL20_wtr.obj>` and :download:`comparison<./support_files/EVAL20.obj>`
shape models. You can view them in a tool such as
`ParaView <https://www.paraview.org/>`__.
.. figure:: images/PointCloudOverlap_1.png
This image shows the reference (pink) and input (blue) shape models.
Run PointCloudOverlap:
::
PointCloudOverlap -inputFile EVAL20.obj -referenceFile EVAL20_wtr.obj -outputFile overlap.vtk
Note that OBJ is supported as an input format but not as an output format.
.. figure:: images/PointCloudOverlap_2.png
The points in white are those in the input model that overlap the reference.
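One plausible overlap criterion can be sketched as: an input point overlaps the reference cloud if some reference point lies within a threshold distance of it. The class name, the brute-force search, and the threshold parameter are assumptions for illustration; PointCloudOverlap's actual test and any spatial indexing it uses are not shown here.

```java
// Illustrative overlap test: does any reference point lie within
// `threshold` of point p? (Squared distances avoid a sqrt per point.)
public class Overlap {
    static double dist2(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < 3; i++) {
            double d = a[i] - b[i];
            s += d * d;
        }
        return s;
    }

    public static boolean overlaps(double[] p, double[][] reference, double threshold) {
        for (double[] r : reference)
            if (dist2(p, r) <= threshold * threshold) return true;
        return false;
    }
}
```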

doc/tools/TriAx.rst (new file)

@@ -0,0 +1,43 @@
.. _TriAx:
=====
TriAx
=====
TriAx is an implementation of the SPC tool TRIAX, which generates a triaxial ellipsoid in ICQ format.
*****
Usage
*****
.. include:: ../toolDescriptions/TriAx.txt
:literal:
*******
Example
*******
Generate an ellipsoid with dimensions 10, 8, 6, with q = 8.
::
TriAx -A 10 -B 8 -C 6 -Q 8 -output triax.icq -saveOBJ
The following ellipsoid is generated:
.. container:: figures-row
.. figure:: images/TriAx_X.png
:alt: looking down from the +X direction
looking down from the +X direction
.. figure:: images/TriAx_Y.png
:alt: looking down from the +Y direction
looking down from the +Y direction
.. figure:: images/TriAx_Z.png
:alt: looking down from the +Z direction
looking down from the +Z direction
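A surface point on a triaxial ellipsoid with semi-axes A, B, C follows the standard parameterization below. This is a sketch of the underlying geometry only; TriAx's ICQ tiling of the surface (governed by the `-Q` argument) is not reproduced here, and the class name is illustrative.

```java
// Illustrative point on a triaxial ellipsoid with semi-axes A, B, C.
public class Ellipsoid {
    // u = longitude (radians), v = latitude (radians).
    public static double[] surfacePoint(double A, double B, double C, double u, double v) {
        return new double[] {
            A * Math.cos(v) * Math.cos(u),
            B * Math.cos(v) * Math.sin(u),
            C * Math.sin(v)
        };
    }
}
```

For the example above (`-A 10 -B 8 -C 6`), the point at zero latitude and longitude sits at (10, 0, 0).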



@@ -89,7 +89,7 @@ function build_jar() {
function make_scripts() {
classes=$(jar tf "${scriptPath}"/target/${packageName}.jar | grep $appSrcDir | grep -v '\$' | grep -v "package-info" | grep class)
classes=$(jar tf ${scriptPath}/target/${packageName}.jar | grep $appSrcDir | grep -v '\$' | grep -v "package-info" | grep -v "Immutable" | grep class)
for class in $classes; do
base=$(basename "$class" ".class")

pom.xml

@@ -1,5 +1,6 @@
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>edu.jhuapl.ses.srn</groupId>
<artifactId>Terrasaur</artifactId>
@@ -42,7 +43,7 @@
<maven.compiler.release>21</maven.compiler.release>
<javafx.version>21.0.5</javafx.version>
<immutables.version>2.10.1</immutables.version>
<immutables.version>2.11.1</immutables.version>
<jackfruit.version>1.1.1</jackfruit.version>
</properties>
@@ -62,6 +63,10 @@
<includes>
<include>**/*.java</include>
</includes>
<excludes>
<exclude>3rd-party/**/*.java</exclude>
<exclude>support-libraries/3rd-party/**/*.java</exclude>
</excludes>
</licenseSet>
</licenseSets>
</configuration>
@@ -126,14 +131,15 @@
<artifactId>maven-surefire-plugin</artifactId>
<version>3.5.3</version>
<configuration>
<argLine> -Djava.library.path=${project.basedir}/3rd-party/${env.ARCH}/spice/JNISpice/lib
<argLine>
-Djava.library.path=${project.basedir}/3rd-party/${env.ARCH}/spice/JNISpice/lib
</argLine>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-enforcer-plugin</artifactId>
<version>3.5.0</version>
<version>3.6.1</version>
<executions>
<execution>
<id>enforce-maven</id>
@@ -207,7 +213,7 @@
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>3.5.0</version>
<version>3.5.1</version>
<executions>
<execution>
<phase>generate-sources</phase>
@@ -222,6 +228,18 @@
</execution>
</executions>
</plugin>
<plugin>
<groupId>com.diffplug.spotless</groupId>
<artifactId>spotless-maven-plugin</artifactId>
<version>2.46.1</version>
<configuration>
<java>
<palantirJavaFormat />
</java>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
@@ -264,7 +282,7 @@
<dependency>
<groupId>commons-beanutils</groupId>
<artifactId>commons-beanutils</artifactId>
<version>1.10.0</version>
<version>1.11.0</version>
</dependency>
<dependency>
<groupId>commons-cli</groupId>
@@ -274,17 +292,22 @@
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-configuration2</artifactId>
<version>2.11.0</version>
<version>2.12.0</version>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-csv</artifactId>
<version>1.13.0</version>
<version>1.14.1</version>
</dependency>
<dependency>
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
<version>2.18.0</version>
<version>2.20.0</version>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-text</artifactId>
<version>1.14.0</version>
</dependency>
<dependency>
<groupId>com.beust</groupId>
@@ -294,7 +317,7 @@
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.12.1</version>
<version>2.13.1</version>
</dependency>
<dependency>
@@ -315,18 +338,13 @@
<dependency>
<groupId>gov.nasa.gsfc.heasarc</groupId>
<artifactId>nom-tam-fits</artifactId>
<version>1.20.2</version>
<version>1.21.1</version>
</dependency>
<dependency>
<groupId>gov.nasa.jpl.naif</groupId>
<artifactId>spice</artifactId>
<version>N0067</version>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-text</artifactId>
<version>1.13.0</version>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-api</artifactId>


@@ -47,8 +47,7 @@ public class ALTWGProductNamer implements ProductNamer {
String[] fields = productName.split("_");
String returnField = "ERROR";
if (fieldNum > fields.length) {
System.out.println(
"ERROR, field:" + fieldNum + " requested is beyond the number of fields found.");
System.out.println("ERROR, field:" + fieldNum + " requested is beyond the number of fields found.");
System.out.println("returning:" + returnField);
} else {
returnField = fields[fieldNum];
@@ -57,8 +56,7 @@ public class ALTWGProductNamer implements ProductNamer {
}
@Override
public String productbaseName(
FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal) {
public String productbaseName(FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal) {
String gsd = "gsd";
String dataSrc = "dataSrc";
@@ -126,8 +124,7 @@ public class ALTWGProductNamer implements ProductNamer {
String clahLon = ALTWGProductNamer.clahLon(cLat, cLon);
// pds likes having filenames all in the same case, so chose lowercase
String outFile =
ALTWGProductNamer.altwgBaseName(region, gsd, dataSrc, productType, clahLon, prodVer);
String outFile = ALTWGProductNamer.altwgBaseName(region, gsd, dataSrc, productType, clahLon, prodVer);
return outFile;
}
@@ -167,8 +164,7 @@ public class ALTWGProductNamer implements ProductNamer {
if (dataSource.length() > 3) {
System.out.println("WARNING! dataSource:" + dataSource + " longer than 3 chars!");
dataSource = dataSource.substring(0, 3);
System.out.println(
"Will set data source to:"
System.out.println("Will set data source to:"
+ dataSource
+ " but"
+ " this might NOT conform to the ALTWG naming convention!");
@@ -314,8 +310,7 @@ public class ALTWGProductNamer implements ProductNamer {
}
if (Double.isNaN(gsdD)) {
// still cannot parse gsd. Set to -999
System.out.println(
"WARNING: No valid values of GSD or GSDI could be parsed from fits header!");
System.out.println("WARNING: No valid values of GSD or GSDI could be parsed from fits header!");
System.out.println("Setting gsd = -999");
gsdD = -999D;
}


@@ -50,8 +50,7 @@ public class AltwgMLNNamer implements ProductNamer {
* @param isGlobal - N/A. Included here as part of the interface structure.
*/
@Override
public String productbaseName(
FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal) {
public String productbaseName(FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal) {
// initialize string fragments for NFT name. This will help identify
// which string fragments have not been updated by the method.
@@ -187,8 +186,7 @@ public class AltwgMLNNamer implements ProductNamer {
System.out.println("file units:" + fileUnits);
} else {
String errMesg =
"ERROR! Could not find keyword:" + HeaderTag.GSD.toString() + " in hdrBuilder";
String errMesg = "ERROR! Could not find keyword:" + HeaderTag.GSD.toString() + " in hdrBuilder";
throw new RuntimeException(errMesg);
}


@@ -54,8 +54,7 @@ public class DartNamer implements ProductNamer {
if (dataSource.length() > 3) {
System.out.println("WARNING! dataSource:" + dataSource + " longer than 3 chars!");
dataSource = dataSource.substring(0, 3);
System.out.println(
"Will set data source to:"
System.out.println("Will set data source to:"
+ dataSource
+ " but"
+ " this might NOT conform to the ALTWG naming convention!");
@@ -96,8 +95,7 @@ public class DartNamer implements ProductNamer {
String[] fields = productName.split("_");
String returnField = "ERROR";
if (fieldNum > fields.length) {
System.out.println(
"ERROR, field:" + fieldNum + " requested is beyond the number of fields found.");
System.out.println("ERROR, field:" + fieldNum + " requested is beyond the number of fields found.");
System.out.println("returning:" + returnField);
} else {
returnField = fields[fieldNum];
@@ -106,8 +104,7 @@ public class DartNamer implements ProductNamer {
}
@Override
public String productbaseName(
FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal) {
public String productbaseName(FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal) {
String gsd = "gsd";
String dataSrc = "dataSrc";
@@ -249,8 +246,7 @@ public class DartNamer implements ProductNamer {
}
if (Double.isNaN(gsdD)) {
// still cannot parse gsd. Set to -999
System.out.println(
"WARNING: No valid values of GSD or GSDI could be parsed from fits header!");
System.out.println("WARNING: No valid values of GSD or GSDI could be parsed from fits header!");
System.out.println("Setting gsd = -999");
gsdD = -999D;
}


@@ -29,8 +29,11 @@ package terrasaur.altwg.pipeline;
*
*/
public enum NameConvention {
ALTPRODUCT, ALTNFTMLN, DARTPRODUCT, NOMATCH, NONEUSED;
ALTPRODUCT,
ALTNFTMLN,
DARTPRODUCT,
NOMATCH,
NONEUSED;
public static NameConvention parseNameConvention(String name) {
for (NameConvention nameConvention : values()) {
@@ -40,9 +43,8 @@ public enum NameConvention {
}
}
NameConvention nameConvention = NameConvention.NOMATCH;
System.out
.println("NameConvention.parseNameConvention()" + " could not parse naming convention:"
+ name + ". Returning:" + nameConvention.toString());
System.out.println("NameConvention.parseNameConvention()" + " could not parse naming convention:" + name
+ ". Returning:" + nameConvention.toString());
return nameConvention;
}
}


@@ -30,6 +30,11 @@ package terrasaur.altwg.pipeline;
*
*/
public enum NameFields {
GSD, DATATYPE, VERSION, DATASRC, CLATLON, REGION, TBODY;
GSD,
DATATYPE,
VERSION,
DATASRC,
CLATLON,
REGION,
TBODY;
}


@@ -48,8 +48,7 @@ public class NamingFactory {
return new DartNamer();
default:
System.err.println(
"ERROR! Naming convention:" + namingConvention.toString() + " not supported!");
System.err.println("ERROR! Naming convention:" + namingConvention.toString() + " not supported!");
throw new RuntimeException();
}
}
@@ -60,8 +59,7 @@ public class NamingFactory {
* @param pipeConfig
* @return
*/
public static ProductNamer parseNamingConvention(
Map<AltPipelnEnum, String> pipeConfig, boolean verbose) {
public static ProductNamer parseNamingConvention(Map<AltPipelnEnum, String> pipeConfig, boolean verbose) {
ProductNamer productNamer = null;


@@ -29,8 +29,7 @@ public interface ProductNamer {
public String getNameFrag(String productName, int fieldNum);
public String productbaseName(FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct,
boolean isGlobal);
public String productbaseName(FitsHdrBuilder hdrBuilder, AltwgDataType altwgProduct, boolean isGlobal);
public String getVersion(FitsHdrBuilder hdrBuilder);
@@ -39,5 +38,4 @@ public interface ProductNamer {
public NameConvention getNameConvention();
public double gsdFromFilename(String filename);
}


@@ -25,7 +25,6 @@ package terrasaur.apps;
import java.io.File;
import java.nio.charset.Charset;
import java.util.*;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
@@ -33,14 +32,14 @@ import org.apache.commons.io.FileUtils;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import spice.basic.Plane;
import spice.basic.Vector3;
import terrasaur.smallBodyModel.BoundingBox;
import terrasaur.smallBodyModel.SmallBodyModel;
import terrasaur.templates.TerrasaurTool;
import terrasaur.utils.Log4j2Configurator;
import terrasaur.utils.NativeLibraryLoader;
import terrasaur.utils.PolyDataUtil;
import spice.basic.Plane;
import spice.basic.Vector3;
import vtk.vtkGenericCell;
import vtk.vtkPoints;
import vtk.vtkPolyData;
@@ -78,26 +77,20 @@ public class AdjustShapeModelToOtherShapeModel implements TerrasaurTool {
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("from")
options.addOption(Option.builder("from")
.hasArg()
.desc(
"path to first shape model in OBJ format which will get shifted to the second shape model")
.desc("path to first shape model in OBJ format which will get shifted to the second shape model")
.build());
options.addOption(
Option.builder("to")
options.addOption(Option.builder("to")
.hasArg()
.desc(
"path to second shape model in OBJ format which the first shape model will try to match to")
.desc("path to second shape model in OBJ format which the first shape model will try to match to")
.build());
options.addOption(
Option.builder("output")
options.addOption(Option.builder("output")
.hasArg()
.desc(
"path to adjusted shape model in OBJ format generated by this program by shifting first to second")
.build());
options.addOption(
Option.builder("filelist")
options.addOption(Option.builder("filelist")
.desc(
"""
If specified then the second required argument to this program,
@@ -109,8 +102,7 @@ public class AdjustShapeModelToOtherShapeModel implements TerrasaurTool {
However, the global shape model formed when all these pieces are
combined together, may not have any holes or gaps.""")
.build());
options.addOption(
Option.builder("fit-plane-radius")
options.addOption(Option.builder("fit-plane-radius")
.hasArg()
.desc(
"""
@@ -119,8 +111,7 @@ public class AdjustShapeModelToOtherShapeModel implements TerrasaurTool {
Use this normal to adjust the point to the second shape model rather
than the radial vector.""")
.build());
options.addOption(
Option.builder("local")
options.addOption(Option.builder("local")
.desc(
"""
Use when adjusting a local OBJ file to another. The best fit plane to the
@@ -139,10 +130,7 @@ public class AdjustShapeModelToOtherShapeModel implements TerrasaurTool {
}
public static void adjustShapeModelToOtherShapeModel(
vtkPolyData frompolydata,
ArrayList<vtkPolyData> topolydata,
double planeRadius,
boolean localModel)
vtkPolyData frompolydata, ArrayList<vtkPolyData> topolydata, double planeRadius, boolean localModel)
throws Exception {
vtkPoints points = frompolydata.GetPoints();
long numberPoints = frompolydata.GetNumberOfPoints();
@@ -173,8 +161,7 @@ public class AdjustShapeModelToOtherShapeModel implements TerrasaurTool {
// fit a plane to the local model and check that the normal points outward
Plane localPlane = PolyDataUtil.fitPlaneToPolyData(frompolydata);
Vector3 localNormalVector = localPlane.getNormal();
if (localNormalVector.dot(localPlane.getPoint()) < 0)
localNormalVector = localNormalVector.negate();
if (localNormalVector.dot(localPlane.getPoint()) < 0) localNormalVector = localNormalVector.negate();
localNormal = localNormalVector.toArray();
}
@@ -182,6 +169,7 @@ public class AdjustShapeModelToOtherShapeModel implements TerrasaurTool {
Vector3D origin = new Vector3D(0., 0., 0.);
for (int i = 0; i < numberPoints; ++i) {
points.GetPoint(i, p);
Vector3D thisPoint = new Vector3D(p);
Vector3D lookDir;
@@ -198,59 +186,37 @@ public class AdjustShapeModelToOtherShapeModel implements TerrasaurTool {
}
Vector3D lookPt = lookDir.scalarMultiply(diagonalLength);
lookPt = lookPt.add(origin);
lookPt = lookPt.add(thisPoint);
List<Vector3D> intersections = new ArrayList<>();
for (vtksbCellLocator cellLocator : cellLocators) {
double[] intersectPoint = new double[3];
// trace ray from the lookPt to the origin - first intersection is the farthest intersection
// from the origin
int result =
cellLocator.IntersectWithLine(
lookPt.toArray(),
origin.toArray(),
tol,
t,
intersectPoint,
pcoords,
subId,
cell_id,
cell);
Vector3D intersectVector = new Vector3D(intersectPoint);
if (fitPlane || localModel) {
// trace ray from thisPoint to the lookPt - Assume cell intersection is the closest one if
// there are multiple?
// NOTE: result should return 1 in case of intersection but doesn't sometimes.
// Use the norm of intersection point to test for intersection instead.
int result = cellLocator.IntersectWithLine(
thisPoint.toArray(), lookPt.toArray(), tol, t, intersectPoint, pcoords, subId, cell_id, cell);
Vector3D intersectVector = new Vector3D(intersectPoint);
NavigableMap<Double, Vector3D> pointsMap = new TreeMap<>();
if (intersectVector.getNorm() > 0) {
pointsMap.put(origin.subtract(intersectVector).getNorm(), intersectVector);
pointsMap.put(thisPoint.subtract(intersectVector).getNorm(), intersectVector);
}
// look in the other direction
lookPt = lookDir.scalarMultiply(-diagonalLength);
lookPt = lookPt.add(origin);
result =
cellLocator.IntersectWithLine(
lookPt.toArray(),
origin.toArray(),
tol,
t,
intersectPoint,
pcoords,
subId,
cell_id,
cell);
lookPt = lookPt.add(thisPoint);
result = cellLocator.IntersectWithLine(
thisPoint.toArray(), lookPt.toArray(), tol, t, intersectPoint, pcoords, subId, cell_id, cell);
intersectVector = new Vector3D(intersectPoint);
if (intersectVector.getNorm() > 0) {
pointsMap.put(origin.subtract(intersectVector).getNorm(), intersectVector);
pointsMap.put(thisPoint.subtract(intersectVector).getNorm(), intersectVector);
}
if (!pointsMap.isEmpty()) intersections.add(pointsMap.get(pointsMap.firstKey()));
} else {
if (result > 0) intersections.add(intersectVector);
}
}
if (intersections.isEmpty()) throw new Exception("Error: no intersections at all");
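The hunk above collects candidate intersection points in a `TreeMap` keyed by distance from the query point and keeps the nearest. That selection step can be sketched in isolation (with a hypothetical helper, not the tool's own code):

```java
import java.util.NavigableMap;
import java.util.TreeMap;

// Illustrative nearest-candidate selection: key each candidate point by its
// distance from `from` in a sorted map; the first entry is the closest.
public class ClosestHit {
    public static double[] closest(double[] from, double[][] candidates) {
        NavigableMap<Double, double[]> byDist = new TreeMap<>();
        for (double[] c : candidates) {
            double dx = c[0] - from[0], dy = c[1] - from[1], dz = c[2] - from[2];
            byDist.put(Math.sqrt(dx * dx + dy * dy + dz * dz), c);
        }
        return byDist.firstEntry().getValue();
    }
}
```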


@@ -59,48 +59,45 @@ public class AppendOBJ implements TerrasaurTool {
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("logFile")
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(
Option.builder("logLevel")
options.addOption(Option.builder("logLevel")
.hasArg()
.desc(
"If present, print messages above selected priority. Valid values are "
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim()
+ ". Default is INFO.")
.build());
options.addOption(
Option.builder("boundary")
options.addOption(Option.builder("boundary")
.desc("Only save out boundary. This option implies -vtk.")
.build());
options.addOption(
Option.builder("decimate")
options.addOption(Option.builder("decimate")
.hasArg()
.desc(
"Reduce the number of facets in the output shape model. The argument should be between 0 and 1. "
+ "For example, if a model has 100 facets and <arg> is 0.90, "
+ "there will be approximately 10 facets after the decimation.")
.build());
options.addOption(
Option.builder("input")
options.addOption(Option.builder("input")
.required()
.hasArgs()
.desc(
"input file(s) to read. Format is derived from the allowed extension: "
.desc("input file(s) to read. Format is derived from the allowed extension: "
+ "icq, llr, obj, pds, plt, ply, stl, or vtk. Multiple files can be specified "
+ "with a single -input option, separated by whitespace. Alternatively, you may "
+ "specify -input multiple times.")
.build());
options.addOption(
Option.builder("output").required().hasArg().desc("output file to write.").build());
options.addOption(
Option.builder("vtk").desc("Save output file in VTK format rather than OBJ.").build());
options.addOption(Option.builder("output")
.required()
.hasArg()
.desc("output file to write.")
.build());
options.addOption(Option.builder("vtk")
.desc("Save output file in VTK format rather than OBJ.")
.build());
return options;
}
@@ -119,8 +116,7 @@ public class AppendOBJ implements TerrasaurTool {
boolean boundaryOnly = cl.hasOption("boundary");
boolean vtkFormat = boundaryOnly || cl.hasOption("vtk");
boolean decimate = cl.hasOption("decimate");
double decimationPercentage =
decimate ? Double.parseDouble(cl.getOptionValue("decimate")) : 1.0;
double decimationPercentage = decimate ? Double.parseDouble(cl.getOptionValue("decimate")) : 1.0;
NativeLibraryLoader.loadVtkLibraries();


@@ -41,7 +41,6 @@ public class BatchSubmit implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
@Override
public String shortDescription() {
return "Run a command on a cluster.";
@@ -53,13 +52,11 @@ public class BatchSubmit implements TerrasaurTool {
String footer = "\nRun a command on a cluster.\n";
return TerrasaurTool.super.fullDescription(options, "", footer);
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("command")
options.addOption(Option.builder("command")
.required()
.hasArgs()
.desc("Required. Command(s) to run.")
@@ -67,14 +64,12 @@ public class BatchSubmit implements TerrasaurTool {
StringBuilder sb = new StringBuilder();
for (GridType type : GridType.values()) sb.append(String.format("%s ", type.name()));
options.addOption(
Option.builder("gridType")
options.addOption(Option.builder("gridType")
.hasArg()
.desc("Grid type. Valid values are " + sb + ". Default is LOCAL.")
.build());
options.addOption(
Option.builder("workingDir")
options.addOption(Option.builder("workingDir")
.hasArg()
.desc("Working directory to run command. Default is current directory.")
.build());
@@ -95,8 +90,7 @@ public class BatchSubmit implements TerrasaurTool {
List<String> cmdList = Arrays.asList(cl.getOptionValues("command"));
BatchType batchType = BatchType.GRID_ENGINE;
GridType gridType =
cl.hasOption("gridType") ? GridType.valueOf(cl.getOptionValue("gridType")) : GridType.LOCAL;
GridType gridType = cl.hasOption("gridType") ? GridType.valueOf(cl.getOptionValue("gridType")) : GridType.LOCAL;
BatchSubmitI submitter = BatchSubmitFactory.getBatchSubmit(cmdList, batchType, gridType);
String workingDir = "";
@@ -106,5 +100,4 @@ public class BatchSubmit implements TerrasaurTool {
logger.error(e.getLocalizedMessage(), e);
}
}
}


@@ -64,7 +64,7 @@ import terrasaur.utils.math.MathConversions;
public class CKFromSumFile implements TerrasaurTool {
private final static Logger logger = LogManager.getLogger();
private static final Logger logger = LogManager.getLogger();
@Override
public String shortDescription() {
@@ -79,20 +79,31 @@ public class CKFromSumFile implements TerrasaurTool {
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("config").required().hasArg()
.desc("Required. Name of configuration file.").build());
options.addOption(Option.builder("dumpConfig").hasArg()
.desc("Write out an example configuration to the named file.").build());
options.addOption(Option.builder("logFile").hasArg()
.desc("If present, save screen output to log file.").build());
options.addOption(Option.builder("config")
.required()
.hasArg()
.desc("Required. Name of configuration file.")
.build());
options.addOption(Option.builder("dumpConfig")
.hasArg()
.desc("Write out an example configuration to the named file.")
.build());
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values())
sb.append(String.format("%s ", l.name()));
options.addOption(Option.builder("logLevel").hasArg()
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(Option.builder("logLevel")
.hasArg()
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim() + ". Default is INFO.")
.build());
options.addOption(Option.builder("sumFile").hasArg().required().desc("""
options.addOption(Option.builder("sumFile")
.hasArg()
.required()
.desc(
"""
Required. File listing sumfiles to read. This is a text file, one per line.
Lines starting with # are ignored.
@@ -106,14 +117,18 @@ public class CKFromSumFile implements TerrasaurTool {
D717506131G0.SUM
# This is a comment
D717506132G0.SUM
""").build());
""")
.build());
return options;
}
private final CKFromSumFileConfig config;
private final NavigableMap<SumFile, String> sumFiles;
private CKFromSumFile(){config=null;sumFiles=null;}
private CKFromSumFile() {
config = null;
sumFiles = null;
}
public CKFromSumFile(CKFromSumFileConfig config, NavigableMap<SumFile, String> sumFiles) {
this.config = config;
@@ -136,8 +151,7 @@ public class CKFromSumFile implements TerrasaurTool {
File commentFile = new File(basename + "-comments.txt");
if (commentFile.exists())
if (!commentFile.delete())
logger.error("{} exists but cannot be deleted!", commentFile.getPath());
if (!commentFile.delete()) logger.error("{} exists but cannot be deleted!", commentFile.getPath());
String setupFile = basename + ".setup";
String inputFile = basename + ".inp";
@@ -153,14 +167,21 @@ public class CKFromSumFile implements TerrasaurTool {
sb.append(String.format("\t%s %s\n", sumFile.utcString(), sumFiles.get(sumFile)));
}
sb.append("\n");
sb.append("providing the orientation of ").append(scFrame.getName()).append(" with respect to ").append(config.J2000() ? "J2000" : bodyFixed.getName()).append(". ");
sb.append("providing the orientation of ")
.append(scFrame.getName())
.append(" with respect to ")
.append(config.J2000() ? "J2000" : bodyFixed.getName())
.append(". ");
double first = new TDBTime(sumFiles.firstKey().utcString()).getTDBSeconds();
double last = new TDBTime(sumFiles.lastKey().utcString()).getTDBSeconds() + config.extend();
sb.append("The coverage period is ").append(new TDBTime(first).toUTCString("ISOC", 3)).append(" to ").append(new TDBTime(last).toUTCString("ISOC", 3)).append(" UTC.");
sb.append("The coverage period is ")
.append(new TDBTime(first).toUTCString("ISOC", 3))
.append(" to ")
.append(new TDBTime(last).toUTCString("ISOC", 3))
.append(" UTC.");
String allComments = sb.toString();
for (String comment : allComments.split("\\r?\\n"))
pw.println(WordUtils.wrap(comment, 80));
for (String comment : allComments.split("\\r?\\n")) pw.println(WordUtils.wrap(comment, 80));
} catch (FileNotFoundException e) {
logger.error(e.getLocalizedMessage(), e);
}
@@ -171,10 +192,8 @@ public class CKFromSumFile implements TerrasaurTool {
map.put("CK_TYPE", "3");
map.put("COMMENTS_FILE_NAME", String.format("'%s'", commentFile.getPath()));
map.put("INSTRUMENT_ID", String.format("%d", scFrame.getIDCode()));
map.put("REFERENCE_FRAME_NAME",
String.format("'%s'", config.J2000() ? "J2000" : bodyFixed.getName()));
if (!config.fk().isEmpty())
map.put("FRAMES_FILE_NAME", "'" + config.fk() + "'");
map.put("REFERENCE_FRAME_NAME", String.format("'%s'", config.J2000() ? "J2000" : bodyFixed.getName()));
if (!config.fk().isEmpty()) map.put("FRAMES_FILE_NAME", "'" + config.fk() + "'");
map.put("ANGULAR_RATE_PRESENT", "'MAKE UP/NO AVERAGING'");
map.put("INPUT_TIME_TYPE", "'UTC'");
map.put("INPUT_DATA_TYPE", "'SPICE QUATERNIONS'");
@@ -202,12 +221,9 @@ public class CKFromSumFile implements TerrasaurTool {
Vector3 row1 = rows[Math.abs(config.flipY()) - 1];
Vector3 row2 = rows[Math.abs(config.flipZ()) - 1];
if (config.flipX() < 0)
row0 = row0.negate();
if (config.flipY() < 0)
row1 = row1.negate();
if (config.flipZ() < 0)
row2 = row2.negate();
if (config.flipX() < 0) row0 = row0.negate();
if (config.flipY() < 0) row1 = row1.negate();
if (config.flipZ() < 0) row2 = row2.negate();
Matrix33 refToInstr = new Matrix33(row0, row1, row2);
@@ -231,8 +247,9 @@ public class CKFromSumFile implements TerrasaurTool {
try (PrintWriter pw = new PrintWriter(new FileWriter(inputFile))) {
for (double t : attitudeMap.keySet()) {
SpiceQuaternion q = attitudeMap.get(t);
pw.printf("%s %.14e %.14e %.14e %.14e\n", new TDBTime(t).toUTCString("ISOC", 6),
q.getElt(0), q.getElt(1), q.getElt(2), q.getElt(3));
pw.printf(
"%s %.14e %.14e %.14e %.14e\n",
new TDBTime(t).toUTCString("ISOC", 6), q.getElt(0), q.getElt(1), q.getElt(2), q.getElt(3));
}
} catch (IOException e) {
logger.error(e.getLocalizedMessage(), e);
@@ -277,24 +294,21 @@ public class CKFromSumFile implements TerrasaurTool {
CKFromSumFileConfig appConfig = factory.fromConfig(config);
for (String kernel : appConfig.metakernel())
KernelDatabase.load(kernel);
for (String kernel : appConfig.metakernel()) KernelDatabase.load(kernel);
NavigableMap<SumFile, String> sumFiles = new TreeMap<>((o1, o2) -> {
try {
return Double.compare(new TDBTime(o1.utcString()).getTDBSeconds(),
new TDBTime(o2.utcString()).getTDBSeconds());
return Double.compare(
new TDBTime(o1.utcString()).getTDBSeconds(), new TDBTime(o2.utcString()).getTDBSeconds());
} catch (SpiceErrorException e) {
logger.error(e.getLocalizedMessage(), e);
}
return 0;
});
List<String> lines =
FileUtils.readLines(new File(cl.getOptionValue("sumFile")), Charset.defaultCharset());
List<String> lines = FileUtils.readLines(new File(cl.getOptionValue("sumFile")), Charset.defaultCharset());
for (String line : lines) {
if (line.strip().startsWith("#"))
continue;
if (line.strip().startsWith("#")) continue;
String[] parts = line.strip().split("\\s+");
String filename = parts[0].trim();
sumFiles.put(SumFile.fromFile(new File(filename)), FilenameUtils.getBaseName(filename));
@@ -305,13 +319,10 @@ public class CKFromSumFile implements TerrasaurTool {
TDBTime end = new TDBTime(sumFiles.lastKey().utcString());
String picture = "YYYY_DOY";
String command = app.writeMSOPCKFiles(
String.format("ck_%s_%s", begin.toString(picture), end.toString(picture)),
new ArrayList<>());
String.format("ck_%s_%s", begin.toString(picture), end.toString(picture)), new ArrayList<>());
logger.info("To generate the CK, run:\n\t{}", command);
logger.info("Finished.");
}
}
@@ -136,8 +136,10 @@ public class ColorSpots implements TerrasaurTool {
}
} else {
if (format == FORMAT.LLR) {
double lon = Math.toRadians(Double.parseDouble(record.get(0).trim()));
double lat = Math.toRadians(Double.parseDouble(record.get(1).trim()));
double lon =
Math.toRadians(Double.parseDouble(record.get(0).trim()));
double lat =
Math.toRadians(Double.parseDouble(record.get(1).trim()));
double rad = Double.parseDouble(record.get(2).trim());
Vector3D xyz = new Vector3D(lon, lat).scalarMultiply(rad);
values[0] = xyz.getX();
@@ -166,9 +168,7 @@ public class ColorSpots implements TerrasaurTool {
public TreeMap<Long, DescriptiveStatistics> getStatsFast(
ArrayList<double[]> valuesList, double radius, boolean weight, boolean atVertices) {
return atVertices
? getStatsVertex(valuesList, radius, weight)
: getStatsFacet(valuesList, radius, weight);
return atVertices ? getStatsVertex(valuesList, radius, weight) : getStatsFacet(valuesList, radius, weight);
}
private TreeMap<Long, DescriptiveStatistics> getStatsVertex(
@@ -257,8 +257,7 @@ public class ColorSpots implements TerrasaurTool {
return statMap;
}
public TreeMap<Integer, DescriptiveStatistics> getStats(
ArrayList<double[]> valuesList, double radius) {
public TreeMap<Integer, DescriptiveStatistics> getStats(ArrayList<double[]> valuesList, double radius) {
// for each value, store indices of closest cells and distances
TreeMap<Integer, ArrayList<Pair<Long, Double>>> closestCells = new TreeMap<>();
@@ -266,8 +265,7 @@ public class ColorSpots implements TerrasaurTool {
double[] values = valuesList.get(i);
Vector3D xyz = new Vector3D(values);
TreeSet<Long> sortedCellIDs =
new TreeSet<>(smallBodyModel.findClosestCellsWithinRadius(values, radius));
TreeSet<Long> sortedCellIDs = new TreeSet<>(smallBodyModel.findClosestCellsWithinRadius(values, radius));
sortedCellIDs.add(smallBodyModel.findClosestCell(values));
ArrayList<Pair<Long, Double>> distances = new ArrayList<>();
@@ -329,82 +327,69 @@ public class ColorSpots implements TerrasaurTool {
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("additionalFields")
options.addOption(Option.builder("additionalFields")
.hasArg()
.desc(
"Specify additional fields to write out. Allowed values are min, max, median, n, rms, sum, std, variance. "
+ "More than one field may be specified in a comma separated list (e.g. "
+ "-additionalFields sum,median,rms). Additional fields will be written out after the mean and std columns.")
.build());
options.addOption(
Option.builder("allFacets")
.desc(
"Report values for all facets in OBJ shape model, even if facet is not within searchRadius "
options.addOption(Option.builder("allFacets")
.desc("Report values for all facets in OBJ shape model, even if facet is not within searchRadius "
+ "of any points. Prints NaN if facet not within searchRadius. Default is to only "
+ "print facets which have contributions from input points.")
.build());
options.addOption(
Option.builder("info")
options.addOption(Option.builder("info")
.required()
.hasArg()
.desc(
"Required. Name of CSV file containing value to plot."
+ " Default format is lon, lat, radius, value. See -xyz and -llOnly options for alternate formats.")
.build());
options.addOption(
Option.builder("llOnly").desc("Format of -info file is lon, lat, value.").build());
options.addOption(
Option.builder("logFile")
options.addOption(Option.builder("llOnly")
.desc("Format of -info file is lon, lat, value.")
.build());
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(
Option.builder("logLevel")
options.addOption(Option.builder("logLevel")
.hasArg()
.desc(
"If present, print messages above selected priority. Valid values are "
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim()
+ ". Default is INFO.")
.build());
options.addOption(
Option.builder("normalize")
.desc(
"Report values per unit area (divide by total area of facets within search ellipse).")
options.addOption(Option.builder("normalize")
.desc("Report values per unit area (divide by total area of facets within search ellipse).")
.build());
options.addOption(
Option.builder("noWeight")
options.addOption(Option.builder("noWeight")
.desc("Do not weight points by distance from facet/vertex.")
.build());
options.addOption(
Option.builder("obj")
options.addOption(Option.builder("obj")
.required()
.hasArg()
.desc("Required. Name of shape model to read.")
.build());
options.addOption(
Option.builder("outFile")
options.addOption(Option.builder("outFile")
.hasArg()
.desc("Specify output file to store the output.")
.build());
options.addOption(
Option.builder("searchRadius")
options.addOption(Option.builder("searchRadius")
.hasArg()
.desc(
"Each facet will be colored using a weighted average of all points within searchRadius of the facet/vertex. "
+ "If not present, set to sqrt(2)/2 * mean facet edge length.")
.build());
options.addOption(
Option.builder("writeVertices")
.desc(
"Convert output from a per facet to per vertex format. Each line will be of the form"
options.addOption(Option.builder("writeVertices")
.desc("Convert output from a per facet to per vertex format. Each line will be of the form"
+ " x, y, z, value, sigma where x, y, z are the vector components of vertex V. "
+ " Default is to only report facetID, facet_value, facet_sigma.")
.build());
options.addOption(
Option.builder("xyz").desc("Format of -info file is x, y, z, value.").build());
options.addOption(Option.builder("xyz")
.desc("Format of -info file is x, y, z, value.")
.build());
return options;
}
@@ -449,8 +434,7 @@ public class ColorSpots implements TerrasaurTool {
ColorSpots cs = new ColorSpots(polyData);
ArrayList<double[]> infoValues = cs.readCSV(cl.getOptionValue("info"), format);
TreeMap<Long, DescriptiveStatistics> statMap =
cs.getStatsFast(infoValues, radius, weight, writeVerts);
TreeMap<Long, DescriptiveStatistics> statMap = cs.getStatsFast(infoValues, radius, weight, writeVerts);
double totalArea = 0;
if (normalize) {
@@ -461,8 +445,7 @@ public class ColorSpots implements TerrasaurTool {
double[] pt1 = points.GetPoint(1);
double[] pt2 = points.GetPoint(2);
TriangularFacet tf =
new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
TriangularFacet tf = new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
double area = tf.getArea();
totalArea += area;
@@ -473,7 +456,8 @@ public class ColorSpots implements TerrasaurTool {
ArrayList<FIELD> fields = new ArrayList<>();
if (cl.hasOption("additionalFields")) {
for (String s : cl.getOptionValue("additionalFields").trim().toUpperCase().split(",")) {
for (String s :
cl.getOptionValue("additionalFields").trim().toUpperCase().split(",")) {
for (FIELD f : FIELD.values()) {
if (f.name().equalsIgnoreCase(s)) fields.add(f);
}
@@ -533,10 +517,7 @@ public class ColorSpots implements TerrasaurTool {
}
private static ArrayList<String> writeFacets(
TreeMap<Long, ArrayList<Double>> map,
boolean allFacets,
boolean normalize,
double totalArea) {
TreeMap<Long, ArrayList<Double>> map, boolean allFacets, boolean normalize, double totalArea) {
ArrayList<String> returnList = new ArrayList<>();
@@ -574,8 +555,7 @@ public class ColorSpots implements TerrasaurTool {
if (allFacets || !value.isNaN()) {
// get vertex x,y,z values
polyData.GetPoint(vertex, thisPt);
StringBuilder sb =
new StringBuilder(
StringBuilder sb = new StringBuilder(
String.format("%e, %e, %e, %e, %e", thisPt[0], thisPt[1], thisPt[2], value, sigma));
for (int i = 2; i < values.size(); i++) {
value = values.get(i);
@@ -157,8 +157,7 @@ public class CompareOBJ implements TerrasaurTool {
* @param useOverlappingPoints if true, only points overlapping the reference model will be added
* to the track file.
*/
private void convertPolyDataToTrackFormat(
vtkPolyData polydata, String filename, boolean useOverlappingPoints) {
private void convertPolyDataToTrackFormat(vtkPolyData polydata, String filename, boolean useOverlappingPoints) {
vtkPoints polyDataPoints = polydata.GetPoints();
vtkPoints points = polyDataPoints;
@@ -172,9 +171,7 @@ public class CompareOBJ implements TerrasaurTool {
double[] p = new double[3];
for (int i = 0; i < points.GetNumberOfPoints(); ++i) {
points.GetPoint(i, p);
pw.write(
String.format(
"2010-11-11T00:00:00.000 "
pw.write(String.format("2010-11-11T00:00:00.000 "
+ p[0]
+ " "
+ p[1]
@@ -208,9 +205,7 @@ public class CompareOBJ implements TerrasaurTool {
int maxNumberOfControlPoints,
boolean useOverlappingPoints,
String transformationFile) {
File tmpDir =
new File(
String.format("%s%sCompareOBJ-%d", tmpdir, File.separator, System.currentTimeMillis()));
File tmpDir = new File(String.format("%s%sCompareOBJ-%d", tmpdir, File.separator, System.currentTimeMillis()));
if (!tmpDir.exists()) tmpDir.mkdirs();
String trackFile = tmpDir + File.separator + "shapemodel-as-track.txt";
@@ -223,8 +218,7 @@ public class CompareOBJ implements TerrasaurTool {
else if (computeOptimalRotation) transformationType = "--rotation-only ";
File lsk = ResourceUtils.writeResourceToFile("/resources/kernels/lsk/naif0012.tls");
String command =
"lidar-optimize --max-number-control-points "
String command = "lidar-optimize --max-number-control-points "
+ maxNumberOfControlPoints
+ " "
+ transformationType
@@ -249,8 +243,7 @@ public class CompareOBJ implements TerrasaurTool {
// save a copy of the JSON transformation file
if (transformationFile != null) {
try (PrintWriter pw = new PrintWriter(transformationFile)) {
List<String> lines =
FileUtils.readLines(new File(tmpTransformationFile), Charset.defaultCharset());
List<String> lines = FileUtils.readLines(new File(tmpTransformationFile), Charset.defaultCharset());
for (String line : lines) pw.println(line);
} catch (IOException e) {
logger.error(e.getLocalizedMessage(), e);
@@ -436,8 +429,7 @@ public class CompareOBJ implements TerrasaurTool {
sumDiff *= radius;
System.out.printf(
"Sum normalized difference in horizontal distances (%d points used): %f meters\n",
npts, sumDiff);
"Sum normalized difference in horizontal distances (%d points used): %f meters\n", npts, sumDiff);
}
/**
@@ -520,8 +512,7 @@ public class CompareOBJ implements TerrasaurTool {
pointLocator.FindPointsWithinRadius(radius, p.toArray(), idList);
if (idList.GetNumberOfIds() < 3) {
logger.error(
String.format(
logger.error(String.format(
"point %d (%f %f %f): %d points within %f, using radial vector to find intersection",
i, p.getX(), p.getY(), p.getZ(), idList.GetNumberOfIds(), radius));
normal = p.normalize();
@@ -547,8 +538,7 @@ public class CompareOBJ implements TerrasaurTool {
}
}
Optional<Vector3D> normalPoint =
findIntersectPointInNormalDirection(p, normal, smallBodyTruth);
Optional<Vector3D> normalPoint = findIntersectPointInNormalDirection(p, normal, smallBodyTruth);
DistanceContainer dc = null;
// Skip this plate in the error calculation if there is no intersection
@@ -569,8 +559,7 @@ public class CompareOBJ implements TerrasaurTool {
if (saveIndex) {
long closestFacet = smallBodyTruth.findClosestCell(p.toArray());
outClosestIndices.write(
String.format("%d, %d, %f\n", i, closestFacet, closestDistance));
outClosestIndices.write(String.format("%d, %d, %f\n", i, closestFacet, closestDistance));
}
}
@@ -630,39 +619,33 @@ public class CompareOBJ implements TerrasaurTool {
}
minDist = distanceContainerVector.get(0).getClosestDistance();
maxDist =
distanceContainerVector.get(distanceContainerVector.size() - 1).getClosestDistance();
maxDist = distanceContainerVector
.get(distanceContainerVector.size() - 1)
.getClosestDistance();
}
Vector3D translation = transform.getTranslation();
String tmpString =
String.format(
"%16.8e,%16.8e,%16.8e", translation.getX(), translation.getY(), translation.getZ());
String.format("%16.8e,%16.8e,%16.8e", translation.getX(), translation.getY(), translation.getZ());
System.out.println("Translation: " + tmpString.replaceAll("\\s+", ""));
Rotation rotation = transform.getRotation();
Rotation inverse =
rotation.composeInverse(Rotation.IDENTITY, RotationConvention.FRAME_TRANSFORM);
Quaternion q =
new Quaternion(inverse.getQ0(), inverse.getQ1(), inverse.getQ2(), inverse.getQ3())
Rotation inverse = rotation.composeInverse(Rotation.IDENTITY, RotationConvention.FRAME_TRANSFORM);
Quaternion q = new Quaternion(inverse.getQ0(), inverse.getQ1(), inverse.getQ2(), inverse.getQ3())
.getPositivePolarForm();
tmpString =
String.format("%16.8e,%16.8e,%16.8e,%16.8e", q.getQ0(), q.getQ1(), q.getQ2(), q.getQ3());
tmpString = String.format("%16.8e,%16.8e,%16.8e,%16.8e", q.getQ0(), q.getQ1(), q.getQ2(), q.getQ3());
System.out.println("Rotation quaternion: " + tmpString.replaceAll("\\s+", ""));
Vector3D axis = inverse.getAxis(RotationConvention.FRAME_TRANSFORM);
tmpString =
String.format(
tmpString = String.format(
"%16.8e,%16.8e,%16.8e,%16.8e",
Math.toDegrees(inverse.getAngle()), axis.getX(), axis.getY(), axis.getZ());
System.out.println("Rotation angle (degrees) and axis: " + tmpString.replaceAll("\\s+", ""));
Vector3D centerOfRotation = transform.getCenterOfRotation();
tmpString =
String.format(
"%16.8e,%16.8e,%16.8e",
centerOfRotation.getX(), centerOfRotation.getY(), centerOfRotation.getZ());
tmpString = String.format(
"%16.8e,%16.8e,%16.8e", centerOfRotation.getX(), centerOfRotation.getY(), centerOfRotation.getZ());
System.out.println("Center of rotation: " + tmpString.replaceAll("\\s+", ""));
double[][] rotMatrix = inverse.getMatrix();
@@ -700,30 +683,25 @@ public class CompareOBJ implements TerrasaurTool {
Vector3D unitNormal = normal.normalize();
org.apache.commons.math3.geometry.euclidean.threed.Plane p =
new org.apache.commons.math3.geometry.euclidean.threed.Plane(
Vector3D.ZERO, unitNormal, 1e-6);
new org.apache.commons.math3.geometry.euclidean.threed.Plane(Vector3D.ZERO, unitNormal, 1e-6);
Vector3D parallelVector = p.toSpace(p.toSubSpace(translation));
System.out.println(
"Direction perpendicular to plane: "
System.out.println("Direction perpendicular to plane: "
+ unitNormal.getX()
+ " "
+ unitNormal.getY()
+ " "
+ unitNormal.getZ());
System.out.println(
"Magnitude of projection perpendicular to plane: "
+ translation.dotProduct(unitNormal));
System.out.println(
"Projection vector of translation parallel to plane: "
"Magnitude of projection perpendicular to plane: " + translation.dotProduct(unitNormal));
System.out.println("Projection vector of translation parallel to plane: "
+ parallelVector.getX()
+ " "
+ parallelVector.getY()
+ " "
+ parallelVector.getZ());
System.out.println(
"Magnitude of projection vector of translation parallel to plane: "
+ parallelVector.getNorm());
"Magnitude of projection vector of translation parallel to plane: " + parallelVector.getNorm());
/*- SPICE
@@ -752,8 +730,7 @@ public class CompareOBJ implements TerrasaurTool {
}
System.out.println(
numPlatesActuallyUsed
System.out.println(numPlatesActuallyUsed
+ " plates used in error calculation out of "
+ polyDataModel.GetNumberOfCells()
+ " total in the shape model");
@@ -777,14 +754,13 @@ public class CompareOBJ implements TerrasaurTool {
// First do the intersection from "below"
Vector3D startBottom = in.subtract(normal.scalarMultiply(size));
double[] out1 = new double[3];
long cellId1 =
smallBodyModel.computeRayIntersection(startBottom.toArray(), normal.toArray(), out1);
long cellId1 = smallBodyModel.computeRayIntersection(startBottom.toArray(), normal.toArray(), out1);
// Then do the intersection from on "top"
Vector3D startTop = in.add(normal.scalarMultiply(size));
double[] out2 = new double[3];
long cellId2 =
smallBodyModel.computeRayIntersection(startTop.toArray(), normal.negate().toArray(), out2);
long cellId2 = smallBodyModel.computeRayIntersection(
startTop.toArray(), normal.negate().toArray(), out2);
if (cellId1 >= 0 && cellId2 >= 0) {
Vector3D out1V = new Vector3D(out1);
@@ -793,8 +769,7 @@ public class CompareOBJ implements TerrasaurTool {
Vector3D inSubOut1 = in.subtract(out1V);
Vector3D inSubOut2 = in.subtract(out2V);
// If both intersected, take the closest
if (inSubOut1.dotProduct(inSubOut1) < inSubOut2.dotProduct(inSubOut2))
return Optional.of(out1V);
if (inSubOut1.dotProduct(inSubOut1) < inSubOut2.dotProduct(inSubOut2)) return Optional.of(out1V);
else return Optional.of(out2V);
}
if (cellId1 >= 0) return Optional.of(new Vector3D(out1));
@@ -806,19 +781,16 @@ public class CompareOBJ implements TerrasaurTool {
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("logFile")
options.addOption(Option.builder("logFile")
.hasArg()
.argName("path")
.desc("If present, save screen output to <path>.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(
Option.builder("logLevel")
options.addOption(Option.builder("logLevel")
.hasArg()
.desc(
"If present, print messages above selected priority. Valid values are "
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim()
+ ". Default is INFO.")
.build());
@@ -830,7 +802,8 @@ public class CompareOBJ implements TerrasaurTool {
sb.append("-model is transformed to -reference using this optimal ");
sb.append("rotation. This results in an error unbiased by a possible rotation between ");
sb.append("the two models.");
options.addOption(Option.builder("computeOptimalRotation").desc(sb.toString()).build());
options.addOption(
Option.builder("computeOptimalRotation").desc(sb.toString()).build());
sb = new StringBuilder();
sb.append("If specified, the program first computes the optimal translation of ");
@@ -839,7 +812,8 @@ public class CompareOBJ implements TerrasaurTool {
sb.append("is transformed to the -reference using this optimal translation. ");
sb.append("This results in an error unbiased by a possible translation offset between ");
sb.append("the two models.");
options.addOption(Option.builder("computeOptimalTranslation").desc(sb.toString()).build());
options.addOption(
Option.builder("computeOptimalTranslation").desc(sb.toString()).build());
sb = new StringBuilder();
sb.append("If specified, the program first computes the optimal translation and ");
@@ -848,16 +822,16 @@ public class CompareOBJ implements TerrasaurTool {
sb.append("-model is transformed to -reference using this optimal ");
sb.append("translation and rotation. This results in an error unbiased by a possible ");
sb.append("translation offset or rotation between the two models.");
options.addOption(
Option.builder("computeOptimalRotationAndTranslation").desc(sb.toString()).build());
options.addOption(Option.builder("computeOptimalRotationAndTranslation")
.desc(sb.toString())
.build());
sb = new StringBuilder();
sb.append("If specified, the program computes the error in the vertical direction ");
sb.append("(by fitting a plane to -model) and saves it to <path>. This option ");
sb.append("only produces meaningful results for digital terrain models to which a plane ");
sb.append("can be fit.");
options.addOption(
Option.builder("computeVerticalError")
options.addOption(Option.builder("computeVerticalError")
.hasArg()
.argName("path")
.desc(sb.toString())
@@ -869,42 +843,47 @@ public class CompareOBJ implements TerrasaurTool {
sb.append("present, only points within a distance of radius will be used to construct the ");
sb.append("plane. Recommended value is ~5% of the body radius for a global model. Units ");
sb.append("are units of the shape model.");
options.addOption(Option.builder("fitPlaneRadius").hasArg().desc(sb.toString()).build());
options.addOption(
Option.builder("fitPlaneRadius").hasArg().desc(sb.toString()).build());
sb = new StringBuilder();
sb.append("Limit the distances (described in -savePlateDiff) used in calculating the mean ");
sb.append("distance and RMS distance to the closest fraction of all distances.");
options.addOption(Option.builder("limitClosestPoints").hasArg().desc(sb.toString()).build());
options.addOption(Option.builder("limitClosestPoints")
.hasArg()
.desc(sb.toString())
.build());
sb = new StringBuilder();
sb.append("Max number of control points to use in optimization. Default is 2000.");
options.addOption(
Option.builder("maxNumberControlPoints").hasArg().desc(sb.toString()).build());
options.addOption(Option.builder("maxNumberControlPoints")
.hasArg()
.desc(sb.toString())
.build());
options.addOption(
Option.builder("model")
options.addOption(Option.builder("model")
.required()
.hasArg()
.argName("path")
.desc(
"Required. Point cloud/shape file to compare to reference shape. Valid formats are "
.desc("Required. Point cloud/shape file to compare to reference shape. Valid formats are "
+ "anything that can be read by the PointCloudFormatConverter.")
.build());
options.addOption(
Option.builder("reference")
options.addOption(Option.builder("reference")
.required()
.hasArg()
.argName("path")
.desc(
"Required. Reference shape file. Valid formats are FITS, ICQ, OBJ, PLT, PLY, or VTK.")
.desc("Required. Reference shape file. Valid formats are FITS, ICQ, OBJ, PLT, PLY, or VTK.")
.build());
sb = new StringBuilder();
sb.append("Save the rotated and/or translated -model to <path>. ");
sb.append("This option requires one of -computeOptimalRotation, -computeOptimalTranslation ");
sb.append("or -computeOptimalRotationAndTranslation to be specified.");
options.addOption(
Option.builder("saveOptimalShape").hasArg().argName("path").desc(sb.toString()).build());
options.addOption(Option.builder("saveOptimalShape")
.hasArg()
.argName("path")
.desc(sb.toString())
.build());
sb = new StringBuilder();
sb.append("Save to a file specified by <path> the distances (in same units as the shape ");
@@ -912,8 +891,11 @@ public class CompareOBJ implements TerrasaurTool {
sb.append("-reference. The number of lines in the file equals the number of plates in ");
sb.append("the first shape model with each line containing the distance of that plate ");
sb.append("to the closest point in the second shape model.");
options.addOption(
Option.builder("savePlateDiff").hasArg().argName("path").desc(sb.toString()).build());
options.addOption(Option.builder("savePlateDiff")
.hasArg()
.argName("path")
.desc(sb.toString())
.build());
sb = new StringBuilder();
sb.append("Save to a file specified by <path> the index of the closest plate in ");
@@ -921,13 +903,15 @@ public class CompareOBJ implements TerrasaurTool {
sb.append("The format of each line is ");
sb.append("\n\tplate index, closest reference plate index, distance\n");
sb.append("Only valid when -model is a shape model with facet information.");
options.addOption(
Option.builder("savePlateIndex").hasArg().argName("path").desc(sb.toString()).build());
options.addOption(Option.builder("savePlateIndex")
.hasArg()
.argName("path")
.desc(sb.toString())
.build());
sb = new StringBuilder();
sb.append("Save output of lidar-optimize to <path> (JSON format file)");
options.addOption(
Option.builder("saveTransformationFile")
options.addOption(Option.builder("saveTransformationFile")
.hasArg()
.argName("path")
.desc(sb.toString())
@@ -936,8 +920,11 @@ public class CompareOBJ implements TerrasaurTool {
sb = new StringBuilder();
sb.append("Directory to store temporary files. It will be created if it does not exist. ");
sb.append("Default is the current working directory.");
options.addOption(
Option.builder("tmpDir").hasArg().argName("path").desc(sb.toString()).build());
options.addOption(Option.builder("tmpDir")
.hasArg()
.argName("path")
.desc(sb.toString())
.build());
sb = new StringBuilder();
sb.append("Use all points in -model when attempting to fit to ");
@@ -960,34 +947,25 @@ public class CompareOBJ implements TerrasaurTool {
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
double limitClosestPoints =
cl.hasOption("limitClosestPoints")
? Double.parseDouble(cl.getOptionValue("limitClosestPoints"))
: 1.0;
cl.hasOption("limitClosestPoints") ? Double.parseDouble(cl.getOptionValue("limitClosestPoints")) : 1.0;
limitClosestPoints = Math.max(0, Math.min(1, limitClosestPoints));
boolean saveOptimalShape = cl.hasOption("saveOptimalShape");
boolean computeOptimalTranslation =
cl.hasOption("computeOptimalRotationAndTranslation")
|| cl.hasOption("computeOptimalTranslation");
cl.hasOption("computeOptimalRotationAndTranslation") || cl.hasOption("computeOptimalTranslation");
boolean computeOptimalRotation =
cl.hasOption("computeOptimalRotationAndTranslation")
|| cl.hasOption("computeOptimalRotation");
cl.hasOption("computeOptimalRotationAndTranslation") || cl.hasOption("computeOptimalRotation");
final boolean computeHorizontalError = false;
boolean useOverlappingPoints = !cl.hasOption("useAllPoints");
double planeRadius =
cl.hasOption("fitPlaneRadius")
? Double.parseDouble(cl.getOptionValue("fitPlaneRadius"))
: 0;
int maxNumberOfControlPoints =
cl.hasOption("maxNumberControlPoints")
cl.hasOption("fitPlaneRadius") ? Double.parseDouble(cl.getOptionValue("fitPlaneRadius")) : 0;
int maxNumberOfControlPoints = cl.hasOption("maxNumberControlPoints")
? Integer.parseInt(cl.getOptionValue("maxNumberControlPoints"))
: 2000;
int npoints = -1;
double size = 0;
String closestDiffFile =
cl.hasOption("savePlateDiff") ? cl.getOptionValue("savePlateDiff") : null;
String closestIndexFile =
cl.hasOption("savePlateIndex") ? cl.getOptionValue("savePlateIndex") : null;
String closestDiffFile = cl.hasOption("savePlateDiff") ? cl.getOptionValue("savePlateDiff") : null;
String closestIndexFile = cl.hasOption("savePlateIndex") ? cl.getOptionValue("savePlateIndex") : null;
String optimalShapeFile = saveOptimalShape ? cl.getOptionValue("saveOptimalShape") : null;
String verticalDiffFile =
cl.hasOption("computeVerticalError") ? cl.getOptionValue("computeVerticalError") : null;
@@ -60,8 +60,7 @@ public class CreateSBMTStructure implements TerrasaurTool {
@Override
public String fullDescription(Options options) {
String header =
"This tool creates an SBMT ellipse file from a set of point on an image.";
String header = "This tool creates an SBMT ellipse file from a set of points on an image.";
String footer = "";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
@@ -75,8 +74,7 @@ public class CreateSBMTStructure implements TerrasaurTool {
* @param p3 Third point
* @return An SBMT structure describing the ellipse
*/
private static SBMTEllipseRecord createRecord(
int id, String name, Vector3D p1, Vector3D p2, Vector3D p3) {
private static SBMTEllipseRecord createRecord(int id, String name, Vector3D p1, Vector3D p2, Vector3D p3) {
// Create a local coordinate system where X axis contains long axis and Y axis contains short
// axis
@@ -104,8 +102,7 @@ public class CreateSBMTStructure implements TerrasaurTool {
double rotation = Math.atan2(b.getY() - a.getY(), b.getX() - a.getX());
double flattening = (majorAxis - minorAxis) / majorAxis;
ImmutableSBMTEllipseRecord.Builder record =
ImmutableSBMTEllipseRecord.builder()
ImmutableSBMTEllipseRecord.Builder record = ImmutableSBMTEllipseRecord.builder()
.id(id)
.name(name)
.x(origin.getX())
@@ -129,43 +126,70 @@ public class CreateSBMTStructure implements TerrasaurTool {
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("input")
options.addOption(Option.builder("flipX")
.desc("If present, negate the X coordinate of the input points.")
.build());
options.addOption(Option.builder("flipY")
.desc("If present, negate the Y coordinate of the input points.")
.build());
options.addOption(Option.builder("input")
.required()
.hasArg()
.desc(
"""
Required. Name of input file. This is a text file with a pair of pixel coordinates per line. The pixel
Required. Name of input file. This is a text file with a pair of pixel coordinates (X and Y) per line. The pixel
coordinates are offsets from the image center. For example:
# My test file
627.51274 876.11775
630.53612 883.55992
626.3499 881.46681
89.6628 285.01
97.8027 280.126
95.0119 285.01
-13.8299 323.616
-1.9689 331.756
-11.7367 330.826
Empty lines or lines beginning with # are ignored.
Each set of three points is used to create the SBMT structures. The first two points are the long
axis and the third is a location for the semi-minor axis.""")
.build());
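The three-point ellipse construction described above (long axis from the first two points, semi-minor axis from the third) can be sketched in isolation. This is an illustrative helper, not the tool's `createRecord` API; in particular, the distance-to-line computation for the semi-minor axis is a simplification of the local coordinate system the tool actually builds.

```java
/** Illustrative sketch: derive ellipse parameters from three 2-D points. */
public class EllipseFromPoints {

    /** Returns {centerX, centerY, semiMajor, semiMinor, rotationRadians}. */
    static double[] ellipseParams(double[] p1, double[] p2, double[] p3) {
        // First two points span the long axis; the center is their midpoint.
        double cx = (p1[0] + p2[0]) / 2;
        double cy = (p1[1] + p2[1]) / 2;
        double semiMajor = Math.hypot(p2[0] - p1[0], p2[1] - p1[1]) / 2;
        // Rotation of the long axis measured from the +X axis.
        double rotation = Math.atan2(p2[1] - p1[1], p2[0] - p1[0]);
        // Third point sets the semi-minor axis: its perpendicular distance
        // from the line through p1 and p2.
        double dx = p2[0] - p1[0];
        double dy = p2[1] - p1[1];
        double semiMinor =
                Math.abs(dy * (p3[0] - p1[0]) - dx * (p3[1] - p1[1])) / Math.hypot(dx, dy);
        return new double[] {cx, cy, semiMajor, semiMinor, rotation};
    }

    public static void main(String[] args) {
        double[] p = ellipseParams(
                new double[] {-2, 0}, new double[] {2, 0}, new double[] {0, 1});
        // flattening, as in the tool: (a - b) / a
        double flattening = (p[2] - p[3]) / p[2];
        System.out.printf("a=%.1f b=%.1f rot=%.1f flattening=%.2f%n",
                p[2], p[3], p[4], flattening);
    }
}
```

With the sample triple above the semi-major axis is 2, the semi-minor axis is 1, and the rotation is 0.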
options.addOption(
Option.builder("objFile")
options.addOption(Option.builder("objFile")
.required()
.hasArg()
.desc("Required. Name of OBJ shape file.")
.build());
options.addOption(
Option.builder("output")
options.addOption(Option.builder("output")
.required()
.hasArg()
.desc("Required. Name of output file.")
.build());
options.addOption(
Option.builder("sumFile")
options.addOption(Option.builder("spice")
.hasArg()
.desc(
"If present, name of metakernel to read. Other required options with -spice are -date, -observer, -target, and -cameraFrame.")
.build());
options.addOption(Option.builder("date")
.hasArgs()
.desc("Only used with -spice. Date of image (e.g. 2022 SEP 26 23:11:12.649).")
.build());
options.addOption(Option.builder("observer")
.hasArg()
.desc("Only used with -spice. Observing body (e.g. DART)")
.build());
options.addOption(Option.builder("target")
.hasArg()
.desc("Only used with -spice. Target body (e.g. DIMORPHOS).")
.build());
options.addOption(Option.builder("cameraFrame")
.hasArg()
.desc("Only used with -spice. Camera frame (e.g. DART_DRACO).")
.build());
options.addOption(Option.builder("sumFile")
.required()
.hasArg()
.desc("Required. Name of sum file to read.")
.desc(
"Required. Name of sum file to read. This is still required with -spice, but only used as a template to create a new sum file.")
.build());
return options;
}
@@ -178,8 +202,7 @@ axis and the third is a location for the semi-minor axis.""")
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
for (MessageLabel ml : startupMessages.keySet()) logger.info("{} {}", ml.label, startupMessages.get(ml));
NativeLibraryLoader.loadSpiceLibraries();
NativeLibraryLoader.loadVtkLibraries();
@@ -195,15 +218,20 @@ axis and the third is a location for the semi-minor axis.""")
}
RangeFromSumFile rfsf = new RangeFromSumFile(sumFile, polyData);
boolean flipX = cl.hasOption("flipX");
boolean flipY = cl.hasOption("flipY");
List<Vector3D> intercepts = new ArrayList<>();
List<String> lines =
FileUtils.readLines(new File(cl.getOptionValue("input")), Charset.defaultCharset());
for (String line :
lines.stream().filter(s -> !(s.isBlank() || s.strip().startsWith("#"))).toList()) {
List<String> lines = FileUtils.readLines(new File(cl.getOptionValue("input")), Charset.defaultCharset());
for (String line : lines.stream()
.filter(s -> !(s.isBlank() || s.strip().startsWith("#")))
.toList()) {
String[] parts = line.split("\\s+");
int ix = (int) Math.round(Double.parseDouble(parts[0]));
int iy = (int) Math.round(Double.parseDouble(parts[1]));
if (flipX) ix *= -1;
if (flipY) iy *= -1;
Map.Entry<Long, Vector3D> entry = rfsf.findIntercept(ix, iy);
long cellID = entry.getKey();
if (cellID > -1) intercepts.add(entry.getValue());

View File

@@ -59,39 +59,35 @@ public class DSK2OBJ implements TerrasaurTool {
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("body")
options.addOption(Option.builder("body")
.hasArg()
.desc(
"If present, convert shape for named body. Default is to use the first body in the DSK.")
.desc("If present, convert shape for named body. Default is to use the first body in the DSK.")
.build());
options.addOption(
Option.builder("dsk").hasArg().required().desc("Required. Name of input DSK.").build());
options.addOption(
Option.builder("logFile")
options.addOption(Option.builder("dsk")
.hasArg()
.required()
.desc("Required. Name of input DSK.")
.build());
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(
Option.builder("logLevel")
options.addOption(Option.builder("logLevel")
.hasArg()
.desc(
"If present, print messages above selected priority. Valid values are "
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim()
+ ". Default is INFO.")
.build());
options.addOption(Option.builder("obj").hasArg().desc("Name of output OBJ.").build());
options.addOption(
Option.builder("printBodies")
Option.builder("obj").hasArg().desc("Name of output OBJ.").build());
options.addOption(Option.builder("printBodies")
.desc("If present, print bodies and surface ids in DSK.")
.build());
options.addOption(
Option.builder("surface")
options.addOption(Option.builder("surface")
.hasArg()
.desc(
"If present, use specified surface id. Default is to use the first surface id for the body.")
.desc("If present, use specified surface id. Default is to use the first surface id for the body.")
.build());
return options;
}
@@ -147,8 +143,7 @@ public class DSK2OBJ implements TerrasaurTool {
}
Surface[] surfaces = dsk.getSurfaces(b);
Surface s =
cl.hasOption("surface")
Surface s = cl.hasOption("surface")
? new Surface(Integer.parseInt(cl.getOptionValue("surface")), b)
: surfaces[0];
boolean missingSurface = true;
@@ -159,10 +154,8 @@ public class DSK2OBJ implements TerrasaurTool {
}
}
if (missingSurface) {
logger.warn(
String.format(
"Surface %d for body %s not found in DSK! Valid surfaces are:",
s.getIDCode(), b.getName()));
logger.warn(String.format(
"Surface %d for body %s not found in DSK! Valid surfaces are:", s.getIDCode(), b.getName()));
for (Surface surface : surfaces) logger.warn(Integer.toString(surface.getIDCode()));
System.exit(0);
}
@@ -191,8 +184,7 @@ public class DSK2OBJ implements TerrasaurTool {
for (int[] p : plates) {
pw.printf("f %d %d %d\r\n", p[0], p[1], p[2]);
}
logger.info(
String.format(
logger.info(String.format(
"Wrote %d vertices and %d plates to %s for body %d surface %d",
nv[0],
np[0],

View File

@@ -89,7 +89,6 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
private DifferentialVolumeEstimator() {}
@Override
public String shortDescription() {
return "Find volume difference between two shape models.";
@@ -102,15 +101,13 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
public String fullDescription(Options options) {
String header = "";
String footer = "\nThis program finds the volume difference between a shape model and a reference surface. "
+"The reference surface can either be another shape model or a degree "+POLYNOMIAL_DEGREE+" fit to a set of supplied points. "+
"A local coordinate system is derived from the reference surface. The heights of the shape and reference at "+
"each grid point are reported. ";
+ "The reference surface can either be another shape model or a degree " + POLYNOMIAL_DEGREE
+ " fit to a set of supplied points. "
+ "A local coordinate system is derived from the reference surface. The heights of the shape and reference at "
+ "each grid point are reported. ";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
/** input shape model */
private vtkPolyData globalPolyData;
/** shape model in native coordinates */
@@ -169,7 +166,10 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
// the origin of the local coordinate system, in global coordinates
private enum ORIGIN {
MIN_HEIGHT, MAX_HEIGHT, CUSTOM, DEFAULT
MIN_HEIGHT,
MAX_HEIGHT,
CUSTOM,
DEFAULT
}
public Vector3D nativeToLocal(Vector3D nativeIJK) {
@@ -211,8 +211,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
Path2D.Double outline = new Path2D.Double();
for (int i = 0; i < points.size(); i++) {
Vector3D nativeIJK = plane.globalToLocal(points.get(i));
Vector3D localIJK =
nativeToLocal.getKey().applyTo(nativeIJK.subtract(nativeToLocal.getValue()));
Vector3D localIJK = nativeToLocal.getKey().applyTo(nativeIJK.subtract(nativeToLocal.getValue()));
if (i == 0) {
outline.moveTo(localIJK.getX(), localIJK.getY());
} else {
@@ -251,11 +250,9 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
direction[2] = -1;
cellID = nativeSBM.computeRayIntersection(origin, direction, intersect);
}
if (cellID >= 0)
height = direction[2] * new Vector3D(origin).distance(new Vector3D(intersect));
if (cellID >= 0) height = direction[2] * new Vector3D(origin).distance(new Vector3D(intersect));
if (Double.isNaN(height))
return Double.NaN;
if (Double.isNaN(height)) return Double.NaN;
return height;
}
@@ -289,8 +286,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
refHeight = referenceSurface.value(x, y);
}
if (Double.isNaN(refHeight))
return Double.NaN;
if (Double.isNaN(refHeight)) return Double.NaN;
return refHeight;
}
@@ -308,12 +304,9 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
for (double x = 0; x <= gridHalfExtent; x += gridSpacing) {
for (double y = 0; y <= gridHalfExtent; y += gridSpacing) {
localGrid.add(new Vector3D(x, y, 0));
if (y != 0)
localGrid.add(new Vector3D(x, -y, 0));
if (x != 0)
localGrid.add(new Vector3D(-x, y, 0));
if (x != 0 && y != 0)
localGrid.add(new Vector3D(-x, -y, 0));
if (y != 0) localGrid.add(new Vector3D(x, -y, 0));
if (x != 0) localGrid.add(new Vector3D(-x, y, 0));
if (x != 0 && y != 0) localGrid.add(new Vector3D(-x, -y, 0));
}
}
this.gridSpacing = gridSpacing;
@@ -326,17 +319,13 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
GridPoint gp = new GridPoint(localPoint);
gridPoints.add(gp);
if (Double.isFinite(gp.height)) {
if (highGridPoint == null || highGridPoint.height < gp.height)
highGridPoint = gp;
if (lowGridPoint == null || lowGridPoint.height > gp.height)
lowGridPoint = gp;
if (highGridPoint == null || highGridPoint.height < gp.height) highGridPoint = gp;
if (lowGridPoint == null || lowGridPoint.height > gp.height) lowGridPoint = gp;
}
}
if (highGridPoint != null)
highPoint = highGridPoint.globalIJK;
if (lowGridPoint != null)
lowPoint = lowGridPoint.globalIJK;
if (highGridPoint != null) highPoint = highGridPoint.globalIJK;
if (lowGridPoint != null) lowPoint = lowGridPoint.globalIJK;
return gridPoints;
}
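The quadrant-mirrored grid built above (one loop over the positive quadrant, with points reflected into the other three while skipping the axes to avoid duplicates) can be sketched as a standalone helper. The class and method names here are illustrative, not part of Terrasaur.

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch of quadrant-mirrored grid generation over [-halfExtent, halfExtent]^2. */
public class GridSketch {

    static List<double[]> createGrid(double halfExtent, double spacing) {
        List<double[]> grid = new ArrayList<>();
        for (double x = 0; x <= halfExtent; x += spacing) {
            for (double y = 0; y <= halfExtent; y += spacing) {
                grid.add(new double[] {x, y});
                // mirror into the other quadrants; skip axes so points
                // with x == 0 or y == 0 are not added twice
                if (y != 0) grid.add(new double[] {x, -y});
                if (x != 0) grid.add(new double[] {-x, y});
                if (x != 0 && y != 0) grid.add(new double[] {-x, -y});
            }
        }
        return grid;
    }

    public static void main(String[] args) {
        // halfExtent 1, spacing 1 yields the 9 unique points of a 3x3 grid
        System.out.println(createGrid(1, 1).size());
    }
}
```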
@@ -374,8 +363,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
// flip the plane
Pair<Rotation, Vector3D> transform = plane.getTransform();
Vector3D planeNormal = transform.getKey().applyInverseTo(referenceNormal);
if (planeNormal.dotProduct(referenceNormal) < 0)
plane = plane.reverseNormal();
if (planeNormal.dotProduct(referenceNormal) < 0) plane = plane.reverseNormal();
// create the SmallBodyModel for the shape to evaluate
vtkPolyData nativePolyData = new vtkPolyData();
@@ -431,13 +419,19 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
Vector3D translateNativeToLocalInGlobalCoordinates =
localOriginInGlobalCoordinates.subtract(nativeOriginInGlobalCoordinates);
// TODO: check that the Z component is zero (it should be?)
translateNativeToLocal =
globalToLocalTransform.getKey().applyTo(translateNativeToLocalInGlobalCoordinates);
translateNativeToLocal = globalToLocalTransform.getKey().applyTo(translateNativeToLocalInGlobalCoordinates);
}
Rotation rotateNativeToLocal =
MathConversions.toRotation(new RotationMatrixIJK(iRow.getX(), jRow.getX(), kRow.getX(),
iRow.getY(), jRow.getY(), kRow.getY(), iRow.getZ(), jRow.getZ(), kRow.getZ()));
Rotation rotateNativeToLocal = MathConversions.toRotation(new RotationMatrixIJK(
iRow.getX(),
jRow.getX(),
kRow.getX(),
iRow.getY(),
jRow.getY(),
kRow.getY(),
iRow.getZ(),
jRow.getZ(),
kRow.getZ()));
this.nativeToLocal = new AbstractMap.SimpleEntry<>(rotateNativeToLocal, translateNativeToLocal);
}
@@ -454,8 +448,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
sb.append("# Local X and Y are grid coordinates in the local reference frame\n");
sb.append("# Angle is measured from the local X axis, in degrees\n");
sb.append("# ROI flag is 1 if point is in the region of interest, 0 if not\n");
sb.append("# Global X, Y, and Z are the local grid points in the global "
+ " (input) reference system\n");
sb.append("# Global X, Y, and Z are the local grid points in the global (input) reference system\n");
sb.append("# Reference Height is the height of the reference model (or fit surface) above "
+ "the local grid plane\n");
sb.append("# Model Height is the height of the shape model above the local grid plane. "
@@ -476,7 +469,6 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
return sb.toString();
}
/**
* The header for the sector CSV file. Each line begins with a #
*
@@ -507,8 +499,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
sb.append(String.format("%f, ", gp.localIJK.getY()));
double angle = Math.toDegrees(Math.atan2(gp.localIJK.getY(), gp.localIJK.getX()));
if (angle < 0)
angle += 360;
if (angle < 0) angle += 360;
sb.append(String.format("%f, ", angle));
sb.append(String.format("%d, ", isInsideROI(gp.localIJK) ? 1 : 0));
@@ -535,8 +526,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
Point2D thisPoint = new Point2D.Double(localIJ.getX(), localIJ.getY());
boolean insideROI = roiOuter == null || roiOuter.contains(thisPoint);
if (roiInner != null) {
if (insideROI && roiInner.contains(thisPoint))
insideROI = false;
if (insideROI && roiInner.contains(thisPoint)) insideROI = false;
}
return insideROI;
}
@@ -549,9 +539,11 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
* @param sectorsMap grid points within each sector
* @param vtkFile file to write
*/
private void writeReferenceVTK(Collection<GridPoint> gridPointsList,
private void writeReferenceVTK(
Collection<GridPoint> gridPointsList,
Map<Integer, Collection<GridPoint>> profilesMap,
Map<Integer, Collection<GridPoint>> sectorsMap, String vtkFile) {
Map<Integer, Collection<GridPoint>> sectorsMap,
String vtkFile) {
Map<Vector3D, Boolean> roiMap = new HashMap<>();
Map<Vector3D, Integer> profileMap = new HashMap<>();
@@ -559,8 +551,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
for (GridPoint gp : gridPointsList) {
Vector3D localIJK = gp.localIJK;
Vector3D nativeIJK =
nativeToLocal.getKey().applyInverseTo(localIJK).add(nativeToLocal.getValue());
Vector3D nativeIJK = nativeToLocal.getKey().applyInverseTo(localIJK).add(nativeToLocal.getValue());
Vector3D globalIJK = plane.localToGlobal(nativeIJK);
roiMap.put(globalIJK, isInsideROI(gp.localIJK));
profileMap.put(globalIJK, 0);
@@ -639,8 +630,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
try (PrintWriter pw = new PrintWriter(csvFile)) {
pw.println(getHeader(header));
for (GridPoint gp : gridPoints)
pw.println(toCSV(gp));
for (GridPoint gp : gridPoints) pw.println(toCSV(gp));
} catch (FileNotFoundException e) {
logger.warn("Can't write " + csvFile);
logger.warn(e.getLocalizedMessage());
@@ -654,12 +644,11 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
* @param header file header
* @param outputBasename CSV file to write
*/
private Map<Integer, Collection<GridPoint>> writeProfileCSV(Collection<GridPoint> gridPoints,
String header, String outputBasename) {
private Map<Integer, Collection<GridPoint>> writeProfileCSV(
Collection<GridPoint> gridPoints, String header, String outputBasename) {
Map<Integer, Collection<GridPoint>> profileMap = new HashMap<>();
if (numProfiles == 0)
return profileMap;
if (numProfiles == 0) return profileMap;
// sort grid points into radial bins
NavigableMap<Integer, Set<GridPoint>> radialMap = new TreeMap<>();
@@ -677,8 +666,8 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
profileMap.put(i, profileGridPoints);
double angle = deltaAngle * i;
String csvFile = String.format("%s_profile_%03d.csv", outputBasename,
(int) Math.round(Math.toDegrees(angle)));
String csvFile =
String.format("%s_profile_%03d.csv", outputBasename, (int) Math.round(Math.toDegrees(angle)));
try (PrintWriter pw = new PrintWriter(csvFile)) {
pw.println(getHeader(header));
@@ -693,11 +682,9 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
// sort points in this radial bin by angular distance from profile angle
NavigableSet<GridPoint> sortedByAngle = new TreeSet<>((o1, o2) -> {
double angle1 = Math.atan2(o1.localIJK.getY(), o1.localIJK.getX());
if (angle1 < 0)
angle1 += 2 * Math.PI;
if (angle1 < 0) angle1 += 2 * Math.PI;
double angle2 = Math.atan2(o2.localIJK.getY(), o2.localIJK.getX());
if (angle2 < 0)
angle2 += 2 * Math.PI;
if (angle2 < 0) angle2 += 2 * Math.PI;
return Double.compare(Math.abs(angle1 - angle), Math.abs(angle2 - angle));
});
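The comparator above orders grid points by how far their azimuth lies from the profile angle. A self-contained sketch (illustrative names, not the tool's API) looks like this; note that, as in the original, the |angle1 - angle| metric does not wrap around 2&#960;.

```java
import java.util.Comparator;
import java.util.NavigableSet;
import java.util.TreeSet;

/** Sketch: order 2-D points by angular distance from a target azimuth. */
public class AngleSort {

    static NavigableSet<double[]> sortedByAngle(double targetAngle, double[]... points) {
        Comparator<double[]> byAngularDistance = (o1, o2) -> {
            double a1 = Math.atan2(o1[1], o1[0]);
            if (a1 < 0) a1 += 2 * Math.PI; // normalize to [0, 2*pi)
            double a2 = Math.atan2(o2[1], o2[0]);
            if (a2 < 0) a2 += 2 * Math.PI;
            return Double.compare(Math.abs(a1 - targetAngle), Math.abs(a2 - targetAngle));
        };
        NavigableSet<double[]> set = new TreeSet<>(byAngularDistance);
        for (double[] p : points) set.add(p);
        return set;
    }

    public static void main(String[] args) {
        // target angle 0: the point nearest the +X axis sorts first
        double[] first = sortedByAngle(0, new double[] {0, 1}, new double[] {1, 0.1}).first();
        System.out.println(first[0] + " " + first[1]);
    }
}
```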
@@ -705,14 +692,12 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
pw.println(toCSV(sortedByAngle.first()));
GridPoint thisPoint = sortedByAngle.first();
if (Double.isFinite(thisPoint.differentialHeight))
profileGridPoints.add(thisPoint);
if (Double.isFinite(thisPoint.differentialHeight)) profileGridPoints.add(thisPoint);
}
} catch (FileNotFoundException e) {
logger.warn("Can't write {}", csvFile);
logger.warn(e.getLocalizedMessage());
}
}
return profileMap;
@@ -725,14 +710,13 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
* @param header file header
* @param outputBasename CSV file to write
*/
private Map<Integer, Collection<GridPoint>> writeSectorCSV(Collection<GridPoint> gridPoints,
String header, String outputBasename) {
private Map<Integer, Collection<GridPoint>> writeSectorCSV(
Collection<GridPoint> gridPoints, String header, String outputBasename) {
// grid points in each sector
Map<Integer, Collection<GridPoint>> sectorMap = new HashMap<>();
if (numProfiles == 0)
return sectorMap;
if (numProfiles == 0) return sectorMap;
String csvFile = outputBasename + "_sector.csv";
@@ -749,8 +733,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
for (GridPoint gp : gridPoints) {
Vector3D localIJK = gp.localIJK;
double azimuth = Math.atan2(localIJK.getY(), localIJK.getX());
if (azimuth < 0)
azimuth += 2 * Math.PI;
if (azimuth < 0) azimuth += 2 * Math.PI;
double key = aboveMap.floorKey(azimuth);
int sector = (int) (key / deltaAngle);
@@ -788,7 +771,6 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
}
return sectorMap;
}
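The sector assignment above normalizes each azimuth into [0, 2&#960;) and uses a floor lookup against evenly spaced sector boundaries. A minimal sketch of that binning (hypothetical helper, not the tool's code):

```java
import java.util.TreeMap;

/** Sketch: assign an azimuth to one of numSectors equal angular bins. */
public class SectorBin {

    static int sector(double azimuth, int numSectors) {
        double deltaAngle = 2 * Math.PI / numSectors;
        // map each sector's lower boundary angle to its index
        TreeMap<Double, Integer> aboveMap = new TreeMap<>();
        for (int i = 0; i < numSectors; i++) aboveMap.put(i * deltaAngle, i);
        if (azimuth < 0) azimuth += 2 * Math.PI; // normalize to [0, 2*pi)
        // floorEntry finds the largest boundary not exceeding the azimuth
        return aboveMap.floorEntry(azimuth).getValue();
    }

    public static void main(String[] args) {
        // -90 degrees normalizes to 270 degrees, which falls in sector 3 of 4
        System.out.println(sector(Math.toRadians(-90), 4));
    }
}
```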
private class GridPoint implements Comparable<GridPoint> {
@@ -805,8 +787,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
*/
public GridPoint(Vector3D xy) {
this.localIJK = xy;
Vector3D nativeIJK =
nativeToLocal.getKey().applyInverseTo(localIJK).add(nativeToLocal.getValue());
Vector3D nativeIJK = nativeToLocal.getKey().applyInverseTo(localIJK).add(nativeToLocal.getValue());
globalIJK = plane.localToGlobal(nativeIJK);
referenceHeight = getRefHeight(nativeIJK.getX(), nativeIJK.getY());
height = getHeight(nativeIJK.getX(), nativeIJK.getY());
@@ -819,8 +800,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
@Override
public int compareTo(GridPoint o) {
int compare = Double.compare(localIJK.getX(), o.localIJK.getX());
if (compare == 0)
compare = Double.compare(localIJK.getY(), o.localIJK.getY());
if (compare == 0) compare = Double.compare(localIJK.getY(), o.localIJK.getY());
return compare;
}
}
@@ -840,8 +820,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
} else {
List<String> lines = FileUtils.readLines(new File(filename), Charset.defaultCharset());
for (String line : lines) {
if (line.trim().isEmpty() || line.trim().startsWith("#"))
continue;
if (line.trim().isEmpty() || line.trim().startsWith("#")) continue;
SBMTStructure structure = SBMTStructure.fromString(line);
points.add(structure.centerXYZ());
}
@@ -851,32 +830,46 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
}
return points;
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("gridExtent").required().hasArg().desc(
options.addOption(Option.builder("gridExtent")
.required()
.hasArg()
.desc(
"Required. Size of local grid, in same units as shape model and reference surface. Grid is assumed to be square.")
.build());
options.addOption(Option.builder("gridSpacing").required().hasArg()
.desc(
"Required. Spacing of local grid, in same units as shape model and reference surface.")
options.addOption(Option.builder("gridSpacing")
.required()
.hasArg()
.desc("Required. Spacing of local grid, in same units as shape model and reference surface.")
.build());
options.addOption(Option.builder("logFile").hasArg()
.desc("If present, save screen output to log file.").build());
options.addOption(Option.builder("logLevel").hasArg()
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
options.addOption(Option.builder("logLevel")
.hasArg()
.desc("If present, print messages above selected priority. Valid values are "
+ "ALL, OFF, SEVERE, WARNING, INFO, CONFIG, FINE, FINER, or FINEST. Default is INFO.")
.build());
options.addOption(Option.builder("numProfiles").hasArg().desc(
"Number of radial profiles to create. Profiles are evenly spaced in degrees and evaluated "
options.addOption(Option.builder("numProfiles")
.hasArg()
.desc("Number of radial profiles to create. Profiles are evenly spaced in degrees and evaluated "
+ "at intervals of gridSpacing in the radial direction.")
.build());
options.addOption(Option.builder("origin").hasArg()
.desc("If present, set origin of local coordinate system. "
options.addOption(Option.builder("origin")
.hasArg()
.desc(
"If present, set origin of local coordinate system. "
+ "Options are MAX_HEIGHT (set to maximum elevation of the shape model), "
+ "MIN_HEIGHT (set to minimum elevation of the shape model), "
+ "or a three element vector specifying the desired origin, comma separated, no spaces (e.g. 11.45,-45.34,0.932).")
.build());
options.addOption(Option.builder("output").hasArg().required().desc(
options.addOption(Option.builder("output")
.hasArg()
.required()
.desc(
"Basename of output files. Files will be named ${output}_grid.csv for the grid, ${output}_sector.csv for the sectors, "
+ "and ${output}_profile_${degrees}.csv for profiles.")
.build());
@@ -884,26 +877,38 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
.desc("Specify +Z direction of local coordinate system to be in the radial "
+ "direction. Default is to align local +Z along global +Z.")
.build());
options.addOption(Option.builder("referenceList").hasArg().desc(
"File containing reference points. If the file extension is .vtk it is read as a VTK file, "
options.addOption(Option.builder("referenceList")
.hasArg()
.desc("File containing reference points. If the file extension is .vtk it is read as a VTK file, "
+ "otherwise it is assumed to be an SBMT structure file.")
.build());
options.addOption(Option.builder("referenceShape").hasArg().desc("Reference shape.").build());
options.addOption(Option.builder("referenceVTK").hasArg()
options.addOption(Option.builder("referenceShape")
.hasArg()
.desc("Reference shape.")
.build());
options.addOption(Option.builder("referenceVTK")
.hasArg()
.desc("If present, write out a VTK file with the reference surface at each grid point. "
+ "If an ROI is defined color points inside/outside the boundaries.")
.build());
options.addOption(Option.builder("roiInner").hasArg().desc(
options.addOption(Option.builder("roiInner")
.hasArg()
.desc(
"Flag points closer to the origin than this as outside the ROI. Supported formats are the same as referenceList.")
.build());
options.addOption(Option.builder("roiOuter").hasArg().desc(
options.addOption(Option.builder("roiOuter")
.hasArg()
.desc(
"Flag points farther from the origin than this as outside the ROI. Supported formats are the same as referenceList.")
.build());
options.addOption(Option.builder("shapeModel").hasArg().required()
.desc("Shape model for volume computation.").build()); return options;
options.addOption(Option.builder("shapeModel")
.hasArg()
.required()
.desc("Shape model for volume computation.")
.build());
return options;
}
public static void main(String[] args) {
TerrasaurTool defaultOBJ = new DifferentialVolumeEstimator();
@@ -917,7 +922,11 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
StringBuilder header = new StringBuilder();
header.append("# ").append(new Date()).append("\n");
header.append("# ").append(defaultOBJ.getClass().getSimpleName()).append(" [").append(AppVersion.getVersionString()).append("]\n");
header.append("# ")
.append(defaultOBJ.getClass().getSimpleName())
.append(" [")
.append(AppVersion.getVersionString())
.append("]\n");
header.append("# ").append(startupMessages.get(MessageLabel.ARGUMENTS)).append("\n");
NativeLibraryLoader.loadVtkLibraries();
@@ -929,8 +938,7 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
String dirName = FilenameUtils.getFullPath(outputBasename);
if (!dirName.trim().isEmpty()) {
File dir = new File(dirName);
if (!dir.exists())
dir.mkdirs();
if (!dir.exists()) dir.mkdirs();
}
vtkPolyData polyData = null;
@@ -948,8 +956,10 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
if (originString.contains(",")) {
String[] parts = originString.split(",");
if (parts.length == 3) {
localOrigin = new Vector3D(Double.parseDouble(parts[0].trim()),
Double.parseDouble(parts[1].trim()), Double.parseDouble(parts[2].trim()));
localOrigin = new Vector3D(
Double.parseDouble(parts[0].trim()),
Double.parseDouble(parts[1].trim()),
Double.parseDouble(parts[2].trim()));
originType = ORIGIN.CUSTOM;
}
} else {
@@ -958,11 +968,9 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
}
DifferentialVolumeEstimator app = new DifferentialVolumeEstimator(polyData);
if (cl.hasOption("numProfiles"))
app.setNumProfiles(Integer.parseInt(cl.getOptionValue("numProfiles")));
if (cl.hasOption("numProfiles")) app.setNumProfiles(Integer.parseInt(cl.getOptionValue("numProfiles")));
if (cl.hasOption("radialUp"))
app.setRadialUp(true);
if (cl.hasOption("radialUp")) app.setRadialUp(true);
if (cl.hasOption("referenceShape")) {
try {
@@ -994,11 +1002,9 @@ public class DifferentialVolumeEstimator implements TerrasaurTool {
break;
}
if (cl.hasOption("roiInner"))
app.setInnerROI(cl.getOptionValue("roiInner"));
if (cl.hasOption("roiInner")) app.setInnerROI(cl.getOptionValue("roiInner"));
if (cl.hasOption("roiOuter"))
app.setOuterROI(cl.getOptionValue("roiOuter"));
if (cl.hasOption("roiOuter")) app.setOuterROI(cl.getOptionValue("roiOuter"));
NavigableSet<GridPoint> gridPoints = app.createGrid(gridHalfExtent, gridSpacing);

View File

@@ -79,14 +79,12 @@ public class DumpConfig implements TerrasaurTool {
PropertiesConfiguration config = new ConfigBlockFactory().toConfig(configBlock);
PropertiesConfigurationLayout layout = config.getLayout();
String now =
DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
String now = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
.withLocale(Locale.getDefault())
.withZone(ZoneOffset.UTC)
.format(Instant.now());
layout.setHeaderComment(
String.format(
"Configuration file for %s\nCreated %s UTC", AppVersion.getVersionString(), now));
String.format("Configuration file for %s\nCreated %s UTC", AppVersion.getVersionString(), now));
config.write(pw);
} catch (ConfigurationException | IOException e) {

View File

@@ -0,0 +1,261 @@
/*
* The MIT License
* Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in
* all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
* THE SOFTWARE.
*/
package terrasaur.apps;
import java.io.PrintWriter;
import java.util.Arrays;
import java.util.Map;
import java.util.NavigableSet;
import java.util.TreeSet;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
import org.apache.commons.math3.geometry.euclidean.threed.SphericalCoordinates;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.immutables.value.Value;
import terrasaur.templates.TerrasaurTool;
import terrasaur.utils.CellInfo;
import terrasaur.utils.NativeLibraryLoader;
import terrasaur.utils.PolyDataStatistics;
import terrasaur.utils.PolyDataUtil;
import vtk.vtkIdList;
import vtk.vtkOBBTree;
import vtk.vtkPolyData;
public class FacetInfo implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
/**
* This doesn't need to be private, or even declared, but you might want to declare it if you
* have other constructors.
*/
private FacetInfo() {}
@Override
public String shortDescription() {
return "Print info about a facet.";
}
@Override
public String fullDescription(Options options) {
String header = "Prints information about facet(s).";
String footer =
"""
This tool prints out facet center, normal, angle between center and
normal, and other information about the specified facet(s).""";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
private vtkPolyData polyData;
private vtkOBBTree searchTree;
private Vector3D origin;
private FacetInfo(vtkPolyData polyData) {
this.polyData = polyData;
PolyDataStatistics stats = new PolyDataStatistics(polyData);
origin = new Vector3D(stats.getCentroid());
logger.info("Origin is at {}", origin);
logger.info("Creating search tree");
searchTree = new vtkOBBTree();
searchTree.SetDataSet(polyData);
searchTree.SetTolerance(1e-12);
searchTree.BuildLocator();
}
/**
* @param cellId id of this cell
* @return Set of neighboring cells (ones which share a vertex with this one)
*/
private NavigableSet<Long> neighbors(long cellId) {
NavigableSet<Long> neighborCellIds = new TreeSet<>();
vtkIdList vertexIdlist = new vtkIdList();
CellInfo.getCellInfo(polyData, cellId, vertexIdlist);
vtkIdList facetIdlist = new vtkIdList();
for (long i = 0; i < vertexIdlist.GetNumberOfIds(); i++) {
long vertexId = vertexIdlist.GetId(i);
polyData.GetPointCells(vertexId, facetIdlist);
// accumulate now: the next GetPointCells call overwrites facetIdlist
for (long j = 0; j < facetIdlist.GetNumberOfIds(); j++) {
long id = facetIdlist.GetId(j);
if (id == cellId) continue;
neighborCellIds.add(id);
}
}
return neighborCellIds;
}
@Value.Immutable
public abstract static class FacetInfoLine {
public abstract long index();
public abstract Vector3D radius();
public abstract Vector3D normal();
/**
* @return facets between this and origin
*/
public abstract NavigableSet<Long> interiorIntersections();
/**
* @return facets between this and infinity
*/
public abstract NavigableSet<Long> exteriorIntersections();
public static String getHeader() {
return "# Index, "
+ "Center Lat (deg), "
+ "Center Lon (deg), "
+ "Radius, "
+ "Radial Vector, "
+ "Normal Vector, "
+ "Angle between radial and normal (deg), "
+ "facets between this and origin, "
+ "facets between this and infinity";
}
public String toCSV() {
SphericalCoordinates spc = new SphericalCoordinates(radius());
StringBuilder sb = new StringBuilder();
sb.append(String.format("%d, ", index()));
sb.append(String.format("%.4f, ", 90 - Math.toDegrees(spc.getPhi())));
sb.append(String.format("%.4f, ", Math.toDegrees(spc.getTheta())));
sb.append(String.format("%.6f, ", spc.getR()));
sb.append(String.format("%.6f %.6f %.6f, ", radius().getX(), radius().getY(), radius().getZ()));
sb.append(String.format("%.6f %.6f %.6f, ", normal().getX(), normal().getY(), normal().getZ()));
sb.append(String.format("%.3f, ", Math.toDegrees(Vector3D.angle(radius(), normal()))));
sb.append(String.format("%d", interiorIntersections().size()));
if (!interiorIntersections().isEmpty()) {
for (long id : interiorIntersections()) sb.append(String.format(" %d", id));
}
sb.append(", ");
sb.append(String.format("%d", exteriorIntersections().size()));
if (!exteriorIntersections().isEmpty()) {
for (long id : exteriorIntersections()) sb.append(String.format(" %d", id));
}
return sb.toString();
}
}
private FacetInfoLine getFacetInfoLine(long cellId) {
CellInfo ci = CellInfo.getCellInfo(polyData, cellId, new vtkIdList());
vtkIdList cellIds = new vtkIdList();
searchTree.IntersectWithLine(origin.toArray(), ci.center().toArray(), null, cellIds);
// count up all crossings of the surface between the origin and the facet.
NavigableSet<Long> insideIds = new TreeSet<>();
for (long j = 0; j < cellIds.GetNumberOfIds(); j++) {
if (cellIds.GetId(j) == cellId) continue;
insideIds.add(cellIds.GetId(j));
}
Vector3D infinity = ci.center().scalarMultiply(1e9);
cellIds = new vtkIdList();
searchTree.IntersectWithLine(infinity.toArray(), ci.center().toArray(), null, cellIds);
// count up all crossings of the surface between the infinity and the facet.
NavigableSet<Long> outsideIds = new TreeSet<>();
for (long j = 0; j < cellIds.GetNumberOfIds(); j++) {
if (cellIds.GetId(j) == cellId) continue;
outsideIds.add(cellIds.GetId(j));
}
return ImmutableFacetInfoLine.builder()
.index(cellId)
.radius(ci.center())
.normal(ci.normal())
.interiorIntersections(insideIds)
.exteriorIntersections(outsideIds)
.build();
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("facet")
.required()
.hasArgs()
.desc("Facet(s) to query. Separate multiple indices with whitespace.")
.build());
options.addOption(Option.builder("obj")
.required()
.hasArg()
.desc("Shape model to read.")
.build());
options.addOption(Option.builder("output")
.required()
.hasArg()
.desc("CSV file to write.")
.build());
return options;
}
public static void main(String[] args) {
TerrasaurTool tool = new FacetInfo();
Options options = defineOptions();
CommandLine cl = tool.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = tool.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet()) logger.info("{} {}", ml.label, startupMessages.get(ml));
NativeLibraryLoader.loadVtkLibraries();
try {
vtkPolyData polydata = PolyDataUtil.loadShapeModel(cl.getOptionValue("obj"));
FacetInfo app = new FacetInfo(polydata);
try (PrintWriter pw = new PrintWriter(cl.getOptionValue("output"))) {
pw.println(FacetInfoLine.getHeader());
for (long cellId : Arrays.stream(cl.getOptionValues("facet"))
.mapToLong(Long::parseLong)
.toArray()) {
pw.println(app.getFacetInfoLine(cellId).toCSV());
NavigableSet<Long> neighbors = app.neighbors(cellId);
for (long neighborCellId : neighbors)
pw.println(app.getFacetInfoLine(neighborCellId).toCSV());
}
}
logger.info("Wrote {}", cl.getOptionValue("output"));
} catch (Exception e) {
logger.error(e.getLocalizedMessage(), e);
}
logger.info("Finished.");
}
}
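The angle reported by `toCSV()` above is simply the separation between a facet's radial (center) vector and its outward normal. A minimal stand-alone sketch of that one computation, using plain arrays instead of the commons-math `Vector3D` the tool uses (class and method names here are hypothetical, not part of Terrasaur):

```java
public class FacetAngleSketch {

    /** Angle in degrees between two 3-vectors, with the cosine clamped for numerical safety. */
    static double angleDeg(double[] a, double[] b) {
        double dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
        double na = Math.sqrt(a[0] * a[0] + a[1] * a[1] + a[2] * a[2]);
        double nb = Math.sqrt(b[0] * b[0] + b[1] * b[1] + b[2] * b[2]);
        double c = Math.max(-1.0, Math.min(1.0, dot / (na * nb)));
        return Math.toDegrees(Math.acos(c));
    }

    public static void main(String[] args) {
        double[] center = {1, 0, 0}; // facet center along +x
        double[] normal = {1, 0, 1}; // normal tilted 45 degrees toward +z
        System.out.printf("%.1f%n", angleDeg(center, normal)); // prints 45.0
    }
}
```

A large angle between the two vectors is what the new `-fast` mode of ValidateNormals checks for when screening for overhangs.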

View File

@@ -282,31 +282,19 @@ public class GetSpots implements TerrasaurTool {
Instrument instrument = new Instrument(instID);
FOV fov = new FOV(instrument);
Matrix33 instrToBodyFixed = fov.getReferenceFrame().getPositionTransformation(BodyFixed, tdbTime);
Vector3 bsightBodyFixed = instrToBodyFixed.mxv(fov.getBoresight());
StateRecord sr = new StateRecord(
new Body(SC_ID_String), tdbTime, BodyFixed, new AberrationCorrection("LT+S"), new Body(TARGET));
Vector3 scposBodyFixed = sr.getPosition();
PositionVector sunPos = new StateRecord(
new Body("SUN"), tdbTime, BodyFixed, new AberrationCorrection("LT+S"), new Body(TARGET))
.getPosition();
double[] double3 = new double[3];
long cellID = smallBodyModel.computeRayIntersection(
scposBodyFixed.toArray(), bsightBodyFixed.hat().toArray(), double3);
if (cellID == -1) return; // no boresight intersection
@@ -344,15 +332,13 @@ public class GetSpots implements TerrasaurTool {
}
} else {
// TODO: add ELLIPSE
System.err.printf("Instrument %s: Unsupported FOV shape %s\n", instrument.getName(), fov.getShape());
System.exit(0);
}
// check all of the boundary vectors for surface intersections
for (Vector3 vector : boundaryBodyFixed) {
cellID = smallBodyModel.computeRayIntersection(
scposBodyFixed.toArray(), vector.hat().toArray(), double3);
if (cellID == -1) {
flag = 1;
@@ -377,8 +363,7 @@ public class GetSpots implements TerrasaurTool {
double emission = facetToSC.sep(facetNormal);
if (emission > Math.PI / 2) continue;
double dist = findDist(scposBodyFixed, bsightIntersectVector, facetCenter) * 1e3; // milliradians
if (dist < maxdist) {
Vector3 facetToSun = sunPos.sub(facetCenter);
@@ -398,8 +383,7 @@ public class GetSpots implements TerrasaurTool {
facetPlane.project(bsightIntersectVector).sub(facetCenter);
// 0 = North, 90 = East
double pos = Math.toDegrees(Math.acos(localNorth.hat().dot(bsightIntersectProjection.hat())));
if (localNorth.cross(bsightIntersectProjection).dot(facetNormal) > 0) pos = 360 - pos;
int nCovered = 0;
@@ -439,9 +423,7 @@ public class GetSpots implements TerrasaurTool {
}
}
StringBuilder output = new StringBuilder(String.format(
"%s %.4f %5.1f %.1f %d %.1f %.1f %.1f",
sclkTime,
dist,
@@ -575,17 +557,19 @@ public class GetSpots implements TerrasaurTool {
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("spice")
.required()
.hasArg()
.desc("SPICE metakernel")
.build());
options.addOption(
Option.builder("obj").required().hasArg().desc("Shape file").build());
options.addOption(Option.builder("instype")
.required()
.hasArg()
.desc("one of OLA_LOW, OLA_HIGH, OTES, OVIRS_SCI, REXIS, REXIS_SXM, POLYCAM, MAPCAM, SAMCAM, or NAVCAM")
.build());
options.addOption(Option.builder("sclk")
.required()
.hasArg()
.desc(
@@ -596,19 +580,15 @@ file containing sclk values for instrument observation times. All values betwee
3/605862045.24157
END""")
.build());
options.addOption(Option.builder("maxdist")
.required()
.hasArg()
.desc("maximum distance of boresight from facet center in milliradians")
.build());
options.addOption(Option.builder("all-facets")
.desc("Optional. If present, entries for all facets will be output, even if there is no intersection.")
.build());
options.addOption(Option.builder("verbose")
.hasArg()
.desc(
"Optional. A level of 1 is equivalent to -all-facets. A level of 2 or higher will print out the boresight intersection position at each sclk.")
@@ -636,8 +616,7 @@ file containing sclk values for instrument observation times. All values betwee
int debugLevel = Integer.parseInt(cl.getOptionValue("verbose", "0"));
if (cl.hasOption("all-facets")) debugLevel = debugLevel == 0 ? 1 : debugLevel + 1;
GetSpots gs = new GetSpots(spicemetakernel, objfile, instrumenttype, sclkfile, distance, debugLevel);
try {
gs.process();
gs.printMap();
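The `pos` computation in the hunk above (0 = North, 90 = East) is a clock angle measured around the facet normal, with the cross-product sign test choosing the rotation sense. A stand-alone sketch of the same rule with plain arrays (illustrative only; not GetSpots code):

```java
public class ClockAngleSketch {

    static double dot(double[] a, double[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    static double[] cross(double[] a, double[] b) {
        return new double[] {
            a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]
        };
    }

    static double norm(double[] a) {
        return Math.sqrt(dot(a, a));
    }

    /** Clock angle of dir from north around normal: 0 = North, 90 = East. */
    static double positionAngleDeg(double[] north, double[] dir, double[] normal) {
        double c = dot(north, dir) / (norm(north) * norm(dir));
        double pos = Math.toDegrees(Math.acos(Math.max(-1.0, Math.min(1.0, c))));
        // same sign convention as the diff above: flip when the cross product
        // of north and dir points along the outward normal
        if (dot(cross(north, dir), normal) > 0) pos = 360 - pos;
        return pos;
    }

    public static void main(String[] args) {
        double[] normal = {0, 0, 1};
        double[] north = {0, 1, 0};
        System.out.println(positionAngleDeg(north, new double[] {1, 0, 0}, normal));  // east: 90.0
        System.out.println(positionAngleDeg(north, new double[] {-1, 0, 0}, normal)); // west: 270.0
    }
}
```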

View File

@@ -120,13 +120,14 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
targetAccelerationJ2000 = new Vector3();
} else {
double duration = t1.getTDBSeconds() - t0.getTDBSeconds();
observerAccelerationJ2000 = finalObserverJ2000
.getVelocity()
.sub(initialObserverJ2000.getVelocity())
.scale(1. / duration);
targetAccelerationJ2000 = finalTargetJ2000
.getVelocity()
.sub(initialTargetJ2000.getVelocity())
.scale(1. / duration);
}
}
@@ -141,23 +142,19 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
try {
double delta = et.sub(t0).getMeasure();
Vector3 observerPosJ2000 = initialObserverJ2000
.getPosition()
.add(initialObserverJ2000.getVelocity().scale(delta))
.add(observerAccelerationJ2000.scale(0.5 * delta * delta));
Vector3 targetPosJ2000 = initialTargetJ2000
.getPosition()
.add(initialTargetJ2000.getVelocity().scale(delta))
.add(targetAccelerationJ2000.scale(0.5 * delta * delta));
Vector3 scPosJ2000 = observerPosJ2000.sub(targetPosJ2000);
Vector3 observerVelJ2000 = initialObserverJ2000.getVelocity().add(observerAccelerationJ2000.scale(delta));
Vector3 targetVelJ2000 = initialTargetJ2000.getVelocity().add(targetAccelerationJ2000.scale(delta));
Vector3 scVelJ2000 = observerVelJ2000.sub(targetVelJ2000);
StateVector scStateJ2000 = new StateVector(scPosJ2000, scVelJ2000);
@@ -174,7 +171,8 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
rayBundlePolyData.SetLines(rayBundleCells);
}
long id0 = rayBundlePoints.InsertNextPoint(lastState.getPosition().toArray());
long id1 = rayBundlePoints.InsertNextPoint(
scStateBodyFixed.getPosition().toArray());
lastState = scStateBodyFixed;
vtkLine line = new vtkLine();
@@ -216,19 +214,19 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
StateVector scStateBodyFixed = getStateBodyFixed(et);
Vector3 closestPoint = new Vector3(
sbm.findClosestPoint(scStateBodyFixed.getPosition().toArray()));
Vector3 toSurface = scStateBodyFixed.getPosition().sub(closestPoint);
double altitude = toSurface.norm();
// time it takes to get halfway to the surface
TDBDuration delta = new TDBDuration(
altitude / (2 * scStateBodyFixed.getVelocity().norm()));
boolean keepGoing = true;
while (keepGoing) {
LatitudinalCoordinates lc = new LatitudinalCoordinates(scStateBodyFixed.getPosition());
records.add(new ImpactRecord(
et,
scStateBodyFixed,
new LatitudinalCoordinates(altitude, lc.getLongitude(), lc.getLatitude())));
@@ -237,33 +235,32 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
scStateBodyFixed = getStateBodyFixed(et);
closestPoint = new Vector3(
sbm.findClosestPoint(scStateBodyFixed.getPosition().toArray()));
toSurface = scStateBodyFixed.getPosition().sub(closestPoint);
altitude = toSurface.norm();
// check that we're still moving towards the target
if (scStateBodyFixed.getPosition().dot(scStateBodyFixed.getVelocity()) > 0) {
logger.warn(
"Stopping at {}; passed closest approach to the body center.", et.toUTCString("ISOC", 3));
keepGoing = false;
}
if (altitude > finalHeight) {
delta = new TDBDuration(toSurface.norm()
/ (2 * scStateBodyFixed.getVelocity().norm()));
} else if (altitude > finalStep) {
delta = new TDBDuration(
finalStep / scStateBodyFixed.getVelocity().norm());
} else {
keepGoing = false;
}
}
LatitudinalCoordinates lc = new LatitudinalCoordinates(scStateBodyFixed.getPosition());
records.add(new ImpactRecord(
et, scStateBodyFixed, new LatitudinalCoordinates(altitude, lc.getLongitude(), lc.getLatitude())));
} catch (SpiceException e) {
logger.error(e.getLocalizedMessage(), e);
@@ -293,8 +290,8 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
}
}
private static Vector3 correctForAberration(Vector3 targetLTS, Body observer, Body target, TDBTime t)
throws SpiceException {
RemoveAberration ra = new RemoveAberration(target, observer);
return ra.getGeometricPosition(t, targetLTS);
@@ -302,122 +299,94 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("date")
.hasArgs()
.desc("Initial UTC date. Required if -sumFile is not used.")
.build());
options.addOption(Option.builder("finalHeight")
.hasArg()
.desc("Height above surface in meters to consider \"impact\". Default is 1 meter.")
.build());
options.addOption(Option.builder("finalStep")
.hasArg()
.desc("Continue printing output below finalHeight in increments of approximate finalStep "
+ "(in meters) until zero. Default is to stop at finalHeight.")
.build());
options.addOption(Option.builder("frame")
.required()
.hasArg()
.desc("Required. Name of body fixed frame.")
.build());
options.addOption(Option.builder("instrumentFrame")
.hasArg()
.desc("SPICE ID for the camera reference frame. Required if -outputTransform "
+ "AND -sumFile are used.")
.build());
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(Option.builder("logLevel")
.hasArg()
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim()
+ ". Default is INFO.")
.build());
options.addOption(Option.builder("objFile")
.required()
.hasArg()
.desc("Required. Name of OBJ shape file.")
.build());
options.addOption(Option.builder("observer")
.required()
.hasArg()
.desc("Required. SPICE ID for the impactor.")
.build());
options.addOption(Option.builder("observerFrame")
.hasArg()
.desc("SPICE ID for the impactor's reference frame. Required if -outputTransform is used.")
.build());
options.addOption(Option.builder("outputTransform")
.hasArg()
.desc("If present, write out a transform file that can be used by TransformShape to place "
+ "coordinates in the spacecraft frame in the body fixed frame. The rotation "
+ " is evaluated at the sumfile time. The translation is evaluated at the impact time. "
+ "Requires -observerFrame option.")
.build());
options.addOption(Option.builder("position")
.hasArg()
.desc("Spacecraft to body vector in body fixed coordinates. Units are km. "
+ "Spacecraft is at the origin to be consistent with sumFile convention.")
.build());
options.addOption(Option.builder("spice")
.required()
.hasArgs()
.desc("Required. SPICE metakernel file containing body fixed frame and spacecraft kernels. "
+ "Can specify more than one kernel, separated by whitespace.")
.build());
options.addOption(Option.builder("sumFile")
.hasArg()
.desc("Name of sum file to read. Coordinate system is assumed to be in the body "
+ "fixed frame with the spacecraft at the origin.")
.build());
options.addOption(Option.builder("target")
.required()
.hasArg()
.desc("Required. SPICE ID for the target.")
.build());
options.addOption(Option.builder("trajectory")
.hasArg()
.desc("If present, name of output VTK file containing trajectory in body fixed coordinates.")
.build());
options.addOption(Option.builder("verbosity")
.hasArg()
.desc("This option does nothing! Use -logLevel instead.")
.build());
options.addOption(Option.builder("velocity")
.hasArg()
.desc("Spacecraft velocity in J2000 relative to the body. Units are km/s. "
+ "If not specified, velocity is calculated using SPICE.")
.build());
return options;
@@ -447,18 +416,14 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
Body target = new Body(cl.getOptionValue("target"));
final double finalHeight =
cl.hasOption("finalHeight") ? Double.parseDouble(cl.getOptionValue("finalHeight")) / 1e3 : 1e-3;
if (finalHeight <= 0) {
logger.warn("Argument to -finalHeight must be positive!");
System.exit(0);
}
final double finalStep =
cl.hasOption("finalStep") ? Double.parseDouble(cl.getOptionValue("finalStep")) / 1e3 : Double.MAX_VALUE;
if (finalStep <= 0) {
logger.warn("Argument to -finalStep must be positive!");
System.exit(0);
@@ -528,21 +493,19 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
}
if (Math.abs(scPosBodyFixed.sub(initialPos).norm()) > 0) {
logger.warn(String.format(
"Warning! Spacecraft position relative to target from SPICE is %s while input position is %s",
new Vector3(scPosBodyFixed), initialPos.toString()));
logger.warn(String.format(
"Difference is %e km", initialPos.sub(scPosBodyFixed).norm()));
logger.warn("Continuing with input position");
}
Vector3 initialPosJ2000 = bodyFixed.getPositionTransformation(J2000, et).mxv(initialPos);
// relative to solar system barycenter in J2000
StateVector initialTargetJ2000 = new StateRecord(target, et, J2000, abCorrNone, new Body(0)).getStateVector();
StateVector initialObserverJ2000 = new StateVector(
initialPosJ2000.add(initialTargetJ2000.getPosition()),
new StateRecord(observer, et, J2000, abCorrNone, new Body(0)).getVelocity());
@@ -552,20 +515,17 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
for (int i = 0; i < 3; i++) tmp[i] = Double.parseDouble(parts[i].trim());
Vector3 scVelJ2000 = new Vector3(tmp);
initialObserverJ2000 = new StateVector(
initialObserverJ2000.getPosition(), scVelJ2000.add(initialTargetJ2000.getVelocity()));
StateRecord obs = new StateRecord(observer, et, J2000, abCorrNone, new Body(0));
logger.info(String.format(
"spacecraft velocity relative to target from SPICE at %s is %s",
et.toUTCString("ISOC", 3), obs.getVelocity().sub(initialTargetJ2000.getVelocity())));
logger.info(String.format("Specified velocity is %s", scVelJ2000));
}
ImpactLocator ifsf = new ImpactLocator(
J2000,
bodyFixed,
sbm,
@@ -624,21 +584,21 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
double lastET = Math.min(records.last().et.getTDBSeconds(), lastTarget);
lastET = Math.min(lastET, lastObserver);
TDBTime last = new TDBTime(lastET);
StateRecord finalObserverJ2000 = new StateRecord(observer, last, J2000, abCorrNone, new Body(0));
StateRecord finalTargetJ2000 = new StateRecord(target, last, J2000, abCorrNone, new Body(0));
System.out.printf(
"%s: Observer velocity relative to SSB (J2000): %s%n",
last.toUTCString("ISOC", 3), finalObserverJ2000.getVelocity());
double duration = last.getTDBSeconds() - first.getTDBSeconds();
Vector3 observerAccelerationJ2000 = finalObserverJ2000
.getVelocity()
.sub(initialObserverJ2000.getVelocity())
.scale(1. / duration);
Vector3 targetAccelerationJ2000 = finalTargetJ2000
.getVelocity()
.sub(initialTargetJ2000.getVelocity())
.scale(1. / duration);
System.out.printf("Estimated time of impact %s\n", last.toUTCString("ISOC", 6));
System.out.printf("Estimated time to impact %.6f seconds\n", duration);
@@ -651,8 +611,7 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
System.out.println();
// Run again with constant accelerations for target and observer
ifsf = new ImpactLocator(
J2000,
bodyFixed,
sbm,
@@ -681,8 +640,7 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
for (ImpactRecord record : records) {
PositionVector p = record.scStateBodyFixed.getPosition();
VelocityVector v = record.scStateBodyFixed.getVelocity();
System.out.printf(String.format(
"%26s, %13.6e, %13.6e, %13.6e, %13.6e, %13.6e, %13.6e, %12.4f, %12.4f, %12.4f\n",
record.et.toUTCString("ISOC", 6),
p.getElt(0),
@@ -728,7 +686,8 @@ public class ImpactLocator implements UnivariateFunction, TerrasaurTool {
ReferenceFrame instrFrame = null;
if (cl.hasOption("instrumentFrame"))
instrFrame = new ReferenceFrame(
cl.getOptionValue("instrumentFrame").toUpperCase());
if (instrFrame == null) {
logger.error("-instrumentFrame needed for -outputTransform. Exiting.");
System.exit(0);
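The state extrapolation in the ImpactLocator hunks above is plain constant-acceleration kinematics, p(t) = p0 + v0·Δt + ½·a·Δt², applied componentwise to the J2000 positions. A one-dimensional sketch of that step (illustrative only; no SPICE types):

```java
public class KinematicsSketch {

    /** Position after dt seconds under constant acceleration a. */
    static double extrapolate(double p0, double v0, double a, double dt) {
        return p0 + v0 * dt + 0.5 * a * dt * dt;
    }

    public static void main(String[] args) {
        // p0 = 0 km, v0 = 1 km/s, a = 2 km/s^2, dt = 3 s
        System.out.println(extrapolate(0, 1, 2, 3)); // prints 12.0
    }
}
```

The acceleration itself is estimated in the diff above as a finite difference of the endpoint velocities, (v1 - v0) / duration, which is exact when the acceleration really is constant over the interval.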

View File

@@ -162,16 +162,14 @@ and quality planes are saved out.""";
if (namingConvention.isEmpty()) {
File outFname = new File(outfile);
if (outFname.isDirectory()) {
String errMesg = "ERROR! No naming convention specified but output file:"
+ outfile
+ " is a directory! Must be a full path to an output file!";
throw new RuntimeException(errMesg);
}
}
DataInputStream is = new DataInputStream(new BufferedInputStream(new FileInputStream(mapletFile)));
DataInput sigmaInput = null;
BufferedInputStream sigmabis = null;
if (sigmasFile != null) {
@@ -179,14 +177,11 @@ and quality planes are saved out.""";
Path filePath = Paths.get(sigmasFile);
if (!Files.exists(filePath)) {
System.out.println(
"WARNING! sigmas file:" + filePath.toAbsolutePath() + " not found! Sigmas will default to 0!");
} else {
sigmabis = new BufferedInputStream(
new FileInputStream(filePath.toAbsolutePath().toString()));
sigmaInput = swapBytes ? new LittleEndianDataInputStream(sigmabis) : new DataInputStream(sigmabis);
}
}
DataInput qualityInput = null;
@@ -195,15 +190,13 @@ and quality planes are saved out.""";
System.out.println("Parsing " + qualityFile + " for quality values.");
Path filePath = Paths.get(qualityFile);
if (!Files.exists(filePath)) {
System.out.println("WARNING! quality file:"
+ filePath.toAbsolutePath()
+ " not found! Quality values will default to 0!");
} else {
qualbis = new BufferedInputStream(
new FileInputStream(filePath.toAbsolutePath().toString()));
qualityInput = swapBytes ? new LittleEndianDataInputStream(qualbis) : new DataInputStream(qualbis);
}
}
@@ -374,8 +367,7 @@ and quality planes are saved out.""";
// define this SIGMA as a measurement of height uncertainty
hdrBuilder.setVCbyHeaderTag(HeaderTag.SIG_DEF, "height uncertainty", HeaderTag.SIG_DEF.comment());
// headerValues.put(HeaderTag.SIG_DEF.toString(),
// new HeaderCard(HeaderTag.SIG_DEF.toString(), "Uncertainty", HeaderTag.SIG_DEF.comment()));
@@ -407,8 +399,7 @@ and quality planes are saved out.""";
// create FitsData object. Stores data and relevant information about the data
FitsDataBuilder fitsDataB = new FitsDataBuilder(data, isGlobal);
FitsData fitsData = fitsDataB
.setV(V)
.setAltProdType(productType)
.setDataSource(dataSource)
@@ -438,9 +429,7 @@ and quality planes are saved out.""";
// create llrData object. Stores lat,lon,radius information. Needed to fill out fits header
FitsDataBuilder llrDataB = new FitsDataBuilder(llrData, isGlobal);
FitsData llrFitsData = llrDataB.setV(V)
.setAltProdType(productType)
.setDataSource(dataSource)
.setU(ux, UnitDir.UX)
@@ -468,35 +457,35 @@ and quality planes are saved out.""";
planeList.add(PlaneInfo.SIGMA);
planeList.add(PlaneInfo.QUALITY);
saveDTMFits(hdrBuilder, fitsData, planeList, namingConvention, productType, isGlobal, outfile);
}
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("input-map")
.required()
.hasArg()
.desc("input maplet file")
.build());
options.addOption(Option.builder("output-fits")
.required()
.hasArg()
.desc("output FITS file")
.build());
options.addOption(Option.builder("exclude-position")
.desc(
"Only save out the height, albedo, sigma, quality, and hazard planes to the output file. Used for creating NFT MLNs.")
.build());
options.addOption(Option.builder("noHazard")
.desc(
"Only used in conjunction with -exclude-position. If present then the NFT MLN will NOT include a Hazard plane initially filled with all ones.")
.build());
options.addOption(Option.builder("hazardVal")
.hasArg()
.desc("Only used in conjunction with -exclude-position. If present then will use the specified value.")
.build());
options.addOption(Option.builder("lsb")
.desc(
"""
By default the sigmas and quality binary files are assumed to be in big-endian floating
@@ -504,43 +493,35 @@ and quality planes are saved out.""";
in little-endian format. For example, products created by SPC executables are OS
dependent and intel Linux OSes use little-endian.""")
.build());
options.addOption(Option.builder("scalFactor")
.hasArg()
.desc(
"Enter scale factor used to convert scale to ground sample distance in mm i.e. for SPC maplets the scale factor is 1000000 (km to mm). Set to 1.0e6 by default.")
.build());
options.addOption(Option.builder("sigmas-file")
.hasArg()
.desc(
"Path to binary sigmas file containing sigma values per pixel, in same order as the maplet file. If this option is omitted, the sigma plane is set to all zeros.")
.build());
options.addOption(Option.builder("sigsum-file")
.hasArg()
.desc("Path to ascii sigma summary file, containing the overall sigma value of the maplet.")
.build());
options.addOption(Option.builder("sigmaScale")
.hasArg()
.desc(
"Scale sigmas from sigmas-file by <value>. Only applicable if -sigmas-file is used. Defaults to 1 if not specified.")
.build());
options.addOption(Option.builder("mapname")
.hasArg()
.desc("Sets the MAP_NAME fits keyword to <mapname>. Default is 'Non-NFT DTM'")
.build());
options.addOption(Option.builder("quality-file")
.hasArg()
.desc(
"Path to binary quality file containing quality values. If this option is omitted, the quality plane is set to all zeros.")
.build());
options.addOption(Option.builder("configFile")
.hasArg()
.desc(
"""
@@ -549,8 +530,7 @@ and quality planes are saved out.""";
that are required by the ALTWG SIS. The values may not be populated or UNK if they cannot
be derived from the data itself. This configuration file is a way to populate those values.""")
.build());
options.addOption(Option.builder("namingConvention")
.hasArg()
.desc(
"""
@@ -626,8 +606,7 @@ and quality planes are saved out.""";
public static String parseSigsumFile(String sigsumFile) throws IOException {
if (!Files.exists(Paths.get(sigsumFile))) {
System.out.println("Warning! Sigmas summary file:"
+ sigsumFile
+ " does not exist! Not "
+ "able to parse sigma summary.");
@@ -860,8 +839,7 @@ and quality planes are saved out.""";
// possible renaming of output file
NameConvention nameConvention = NameConvention.parseNameConvention(namingConvention);
File[] outFiles =
NamingFactory.getBaseNameAndCrossRef(nameConvention, hdrBuilder, productType, isGlobal, outfile);
// check if cross-ref file is not null. If so then output file was renamed to naming convention.
crossrefFile = outFiles[1];
@@ -881,7 +859,6 @@ and quality planes are saved out.""";
}
outFitsFname = new File(outputFolder, baseOutputName + ".fits").getAbsolutePath();
}
ProductFits.saveDataCubeFits(fitsData, planeList, outFitsFname, hdrBuilder, hdrType, crossrefFile);
}
}
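The `-lsb` option above exists because `DataInputStream.readDouble` always interprets bytes as big-endian, so SPC products written on little-endian Intel hardware need a byte swap on read. A minimal self-contained sketch of that swap using only the JDK's `ByteBuffer` (the helper name `swapDouble` is illustrative, not the project's `BinaryUtils` API):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class EndianDemo {

  // Reinterpret the 8 bytes of a double under the opposite byte order.
  public static double swapDouble(double value) {
    ByteBuffer buf = ByteBuffer.allocate(Double.BYTES);
    buf.order(ByteOrder.BIG_ENDIAN).putDouble(value);
    buf.rewind();
    return buf.order(ByteOrder.LITTLE_ENDIAN).getDouble(); // same bytes, reversed order
  }

  public static void main(String[] args) {
    double original = 1.5;
    double swapped = swapDouble(original);
    // Swapping twice restores the original value.
    if (swapDouble(swapped) != original) throw new AssertionError("round trip failed");
    System.out.println("round trip ok");
  }
}
```

In practice a reader would pull raw `long` bits from the stream and apply `Long.reverseBytes` before `Double.longBitsToDouble`; the buffer version above shows the same idea with less bit fiddling.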

View File

@@ -22,6 +22,9 @@
*/
package terrasaur.apps;
import com.beust.jcommander.JCommander;
import com.beust.jcommander.Parameter;
import com.beust.jcommander.ParameterException;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
@@ -33,9 +36,6 @@ import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.apache.commons.cli.Options;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
@@ -53,8 +53,6 @@ public class OBJ2DSK implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
@Override
public String shortDescription() {
return "Convert an OBJ shape file to SPICE DSK format.";
@@ -71,11 +69,17 @@ public class OBJ2DSK implements TerrasaurTool {
jcUsage.setColumnSize(100);
jcUsage.usage(builder, 4, arguments.commandDescription);
return builder.toString();
}
private enum DSK_KEYS {
SURF_NAME,
CENTER_NAME,
REFFRAME_NAME,
NAIF_SURFNAME,
NAIF_SURFCODE,
NAIF_SURFBODY,
METAK,
COMMENTFILE
}
private static class Arguments {
@@ -87,122 +91,166 @@ public class OBJ2DSK implements TerrasaurTool {
@Parameter(names = "-help", help = true)
private boolean help;
@Parameter(
names = "--latMin",
description = "<latMin> Minimum latitude of OBJ in degrees.",
required = false,
order = 0)
private double latMin = -90D;
@Parameter(
names = "--latMax",
description = "<latMax> Maximum latitude of OBJ in degrees.",
required = false,
order = 1)
private double latMax = 90D;
@Parameter(
names = "--lonMin",
description = "<lonMin> Minimum longitude of OBJ in degrees.",
required = false,
order = 2)
private double lonMin = 0;
@Parameter(
names = "--lonMax",
description = "<lonMax> Maximum longitude of OBJ in degrees.",
required = false,
order = 3)
private double lonMax = 360D;
@Parameter(
names = "--fitsFile",
description =
"<filename> path to DTM fits file containing lat,lon"
+ " information as planes. Assumes PLANE1=latitude, PLANE2=longitude. Use in place of specifying lat/lon min/max values.",
required = false,
order = 4)
private String fitsFile = "";
@Parameter(
names = "--fine-scale",
description = "<fine-voxel-scale> Floating point value representing the "
+ " ratio of the spatial index's fine voxel edge length to the average plate extent. "
+ " The 'extent' of a plate in a given coordinate direction is the difference between the maximum and minimum "
+ " values of that coordinate attained on the plate. Only required if mkdsk version is "
+ " lower than 66.",
required = false,
order = 11)
double fineVoxScale = Double.NaN;
@Parameter(
names = "--coarse-scale",
description = " <coarse-voxel-scale>"
+ " Integer value representing the ratio of the edge length of coarse voxels to"
+ " fine voxels. The number must be large enough that the total number of coarse"
+ " voxels is less than the value of MAXCGR, which is currently 1E5."
+ " Only required if mkdsk version is lower than 66.",
required = false,
order = 12)
Integer coarseVoxScale = -999;
@Parameter(
names = "--useSetupFile",
description = "<inputSetupFile> Use <inputSetupFile>"
+ " instead of the default setup file created by the tool.",
required = false,
order = 13)
String inputSetup = "";
@Parameter(
names = "--writesetupFile",
description = "<outputSetupFile> Write the setup file"
+ " to the specified path instead of writing it as a temporary file which gets deleted"
+ " after execution.",
required = false,
order = 14)
String outsetupFname = "";
@Parameter(
names = "--keepTempFiles",
description = "enable this to prevent setup files"
+ " from being deleted. Used for debugging purposes to see what is being sent"
+ " to mkdsk.",
required = false,
order = 15)
boolean keepTmpFiles = false;
@Parameter(
names = "--mkFile",
description = "<spice-meta-kernel-file> path to SPICE meta kernel file."
+ " Metakernel only needs to point to leap seconds kernel and a frames kernel that contains"
+ " the digital ID to CENTER_NAME and REF_FRAME_NAME lookup table."
+ " This argument is REQUIRED if user does NOT supply a setupFile!",
required = false,
order = 4)
String mkFile = "";
@Parameter(
names = "--surfName",
description = "<surfaceName> Allows user to modify the "
+ " SURFACE_NAME (name of the specific shape data set for the central body)"
+ " used in the default setup file created by the tool. This is a required"
+ " keyword in the setup file.",
required = false,
order = 5)
String surfaceName = "BENNU";
@Parameter(
names = "--centName",
description = "<centerName> Allows user to modify the "
+ " CENTER_NAME (central body name) used in the default setup file created by the tool. "
+ " Can also be an ID code. This is a required keyword in the setup file.",
required = false,
order = 6)
String centerName = "BENNU";
@Parameter(
names = "--refName",
description = "<refFrameName> Allows user to modify the "
+ " REF_FRAME_NAME (reference frame name) used in the default setup file created by the tool. "
+ " This is a required keyword in the setup file.",
required = false,
order = 7)
String refFrameName = "IAU_BENNU";
@Parameter(
names = "--naif_surfName",
description = "<naif surface name> Allows user to add the "
+ " NAIF_SURFACE_NAME to the default setup file created by the tool. "
+ " This may be needed under certain conditions. Optional keyword. "
+ " Default is not to use it.",
required = false,
order = 8)
String naifSurfName = "";
@Parameter(
names = "--naif_surfCode",
description = "<integer ID surface code> Allows user to add the "
+ " NAIF_SURFACE_CODE to the default setup file created by the tool. "
+ " Allows the tool to associate this ID code to the NAIF_SURFACE_NAME. Optional keyword. "
+ " Default is not to use it.",
required = false,
order = 9)
String naifSurfCode = "";
@Parameter(
names = "--naif_surfBody",
description = "<integer body ID code> Allows user to add the "
+ " NAIF_SURFACE_BODY to the default setup file created by the tool. "
+ " This may be needed under certain conditions. Optional keyword."
+ " Default is not to use it.",
required = false,
order = 10)
String naifSurfBody = "";
@Parameter(
names = "--cmtFile",
description = "<commentFile> Specify the comment file"
+ " that mkdsk will add to the DSK. Comment file is an ASCII file containing"
+ " additional information about the DSK. Default is single space.",
required = false,
order = 11)
String cmtFile = " ";
@Parameter(names = "-shortDescription", hidden = true)
@@ -222,7 +270,6 @@ public class OBJ2DSK implements TerrasaurTool {
+ " output dsk file\n"
+ " NOTE: MUST set --metakFile if not supplying a custom setup file!")
private List<String> files = new ArrayList<>();
}
private final Double fineVoxScale;
@@ -269,7 +316,6 @@ public class OBJ2DSK implements TerrasaurTool {
System.out.println(ex.getMessage());
command = new JCommander(arg);
System.exit(1);
}
if ((args.length < 1) || (arg.help)) {
@@ -323,7 +369,6 @@ public class OBJ2DSK implements TerrasaurTool {
NativeLibraryLoader.loadVtkLibraries();
OBJ2DSK obj2dsk;
if ((Double.isNaN(arg.fineVoxScale)) || (arg.coarseVoxScale < 0)) {
obj2dsk = new OBJ2DSK();
@@ -355,8 +400,8 @@ public class OBJ2DSK implements TerrasaurTool {
}
System.out.println("Creating default setup file");
setupFile =
createSetup(latLonMinMax, obj2dsk.fineVoxScale, obj2dsk.coarseVoxScale, dskParams, outsetupFname);
if (keepTmpFiles) {
System.out.println("setup file created here:" + outsetupFname);
} else {
@@ -372,7 +417,6 @@ public class OBJ2DSK implements TerrasaurTool {
throw new RuntimeException(errMesg);
}
System.out.println("Using custom setup file:" + inputSetup);
}
obj2dsk.run(inFile, outFile, latLonMinMax, setupFile, outsetupFname, keepTmpFiles);
@@ -416,8 +460,14 @@ public class OBJ2DSK implements TerrasaurTool {
}
public void run(
String infile,
String outfile,
Map<String, Double> latLonMinMax,
File setupFile,
String outsetupFname,
boolean keepTmpFiles)
throws Exception {
System.out.println("Running OBJ2DSK.");
System.out.println("FINE_VOXEL_SCALE = " + Double.toString(fineVoxScale));
@@ -454,15 +504,13 @@ public class OBJ2DSK implements TerrasaurTool {
PolyDataUtil.saveShapeModelAsOBJ(inpolydata, shapeModel.getAbsolutePath());
// Delete dsk file if already exists since otherwise mkdsk will complain
if (new File(outfile).isFile()) new File(outfile).delete();
String command = "mkdsk -setup " + setupFile.getAbsolutePath() + " -input " + shapeModel.getAbsolutePath()
+ " -output " + outfile;
ProcessUtils.runProgramAndWait(command, null, false);
}
/**
* Create the setup file for mkdsk executable.
*
@@ -473,8 +521,12 @@ public class OBJ2DSK implements TerrasaurTool {
* @param setupFname
* @return
*/
private static File createSetup(
Map<String, Double> latLonCorners,
Double fineVoxScale,
Integer coarseVoxScale,
Map<DSK_KEYS, String> dskParams,
String setupFname) {
// evaluate latlon corners. Exit program if any are NaN.
evaluateCorners(latLonCorners);
@@ -492,8 +544,8 @@ public class OBJ2DSK implements TerrasaurTool {
} else {
setupFile = new File(setupFname);
}
System.out.println("Setup file for mkdsk created here:"
+ setupFile.getAbsolutePath().toString());
// relativize the path to the metakernel file. Do this because mkdsk has a limit on the string
// length to the metakernel. Get normalized absolute path to mkFile in case the user enters a
@@ -509,13 +561,12 @@ public class OBJ2DSK implements TerrasaurTool {
String spicefile = relPath.toString();
if (spicefile.length() > 80) {
System.out.println(
"Error: pointer to SPICE metakernel kernel file may not be longer than" + " 80 characters.");
System.out.println("The paths inside the metakernel file can be as long as 255 characters.");
System.exit(1);
}
// create the content of setup file
StringBuilder sb = new StringBuilder();
sb.append("\\begindata\n");
@@ -533,23 +584,20 @@ public class OBJ2DSK implements TerrasaurTool {
sb.append("INPUT_DATA_UNITS = ( 'ANGLES = DEGREES'\n");
sb.append(" 'DISTANCES = KILOMETERS' )\n");
sb.append("COORDINATE_SYSTEM = 'LATITUDINAL'\n");
String valueString =
String.format("MINIMUM_LATITUDE = %.5f\n", latLonCorners.get(HeaderTag.MINLAT.toString()));
sb.append(valueString);
// out.write("MAXIMUM_LATITUDE = 90\n");
valueString = String.format("MAXIMUM_LATITUDE = %.5f\n", latLonCorners.get(HeaderTag.MAXLAT.toString()));
sb.append(valueString);
// out.write("MINIMUM_LONGITUDE = -180\n");
valueString = String.format("MINIMUM_LONGITUDE = %.5f\n", latLonCorners.get(HeaderTag.MINLON.toString()));
sb.append(valueString);
// out.write("MAXIMUM_LONGITUDE = 180\n");
valueString = String.format("MAXIMUM_LONGITUDE = %.5f\n", latLonCorners.get(HeaderTag.MAXLON.toString()));
sb.append(valueString);
sb.append("DATA_TYPE = 2\n");
@@ -593,7 +641,6 @@ public class OBJ2DSK implements TerrasaurTool {
System.exit(1);
}
return setupFile;
}
/**
@@ -611,5 +658,4 @@ public class OBJ2DSK implements TerrasaurTool {
}
}
}
}

View File

@@ -172,7 +172,9 @@ public class PointCloudFormatConverter implements TerrasaurTool {
double lon = Math.toRadians(Double.parseDouble(parts[0].trim()));
double lat = Math.toRadians(Double.parseDouble(parts[1].trim()));
double range = Double.parseDouble(parts[2].trim());
double[] xyz = new Vector3D(lon, lat)
.scalarMultiply(range)
.toArray();
pointsXYZ.InsertNextPoint(xyz);
} else {
double[] xyz = new double[3];
@@ -191,8 +193,7 @@ public class PointCloudFormatConverter implements TerrasaurTool {
case BIN3:
case BIN4:
case BIN7:
try (DataInputStream dis = new DataInputStream(new BufferedInputStream(new FileInputStream(inFile)))) {
while (dis.available() > 0) {
if (inFormat == FORMATS.BIN7) {
// skip time field
@@ -202,7 +203,8 @@ public class PointCloudFormatConverter implements TerrasaurTool {
double lon = Math.toRadians(BinaryUtils.readDoubleAndSwap(dis));
double lat = Math.toRadians(BinaryUtils.readDoubleAndSwap(dis));
double range = BinaryUtils.readDoubleAndSwap(dis);
double[] xyz =
new Vector3D(lon, lat).scalarMultiply(range).toArray();
pointsXYZ.InsertNextPoint(xyz);
} else {
double[] xyz = new double[3];
@@ -248,8 +250,7 @@ public class PointCloudFormatConverter implements TerrasaurTool {
BoundingBox clipped = bbox.getScaledBoundingBox(clip);
vtkPoints clippedPoints = new vtkPoints();
for (int i = 0; i < pointsXYZ.GetNumberOfPoints(); i++) {
if (clipped.contains(pointsXYZ.GetPoint(i))) clippedPoints.InsertNextPoint(pointsXYZ.GetPoint(i));
}
pointsXYZ = clippedPoints;
polyData = null;
@@ -305,9 +306,7 @@ public class PointCloudFormatConverter implements TerrasaurTool {
}
} else {
if (halfSize < 0 || groundSampleDistance < 0) {
logger.error("Must supply -halfSize and -groundSampleDistance for {} output", outFormat);
return;
}
@@ -366,98 +365,80 @@ public class PointCloudFormatConverter implements TerrasaurTool {
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(Option.builder("logLevel")
.hasArg()
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim()
+ ". Default is INFO.")
.build());
options.addOption(Option.builder("inputFormat")
.hasArg()
.desc("Format of input file. If not present format will be inferred from inputFile extension.")
.build());
options.addOption(Option.builder("inputFile")
.required()
.hasArg()
.desc("Required. Name of input file.")
.build());
options.addOption(Option.builder("outputFormat")
.hasArg()
.desc("Format of output file. If not present format will be inferred from outputFile extension.")
.build());
options.addOption(Option.builder("outputFile")
.required()
.hasArg()
.desc("Required. Name of output file.")
.build());
options.addOption(Option.builder("inllr")
.desc(
"Only used with ASCII or BINARY formats. If present, input values are assumed to be lon, lat, rad. Default is x, y, z.")
.build());
options.addOption(Option.builder("outllr")
.desc(
"Only used with ASCII or BINARY formats. If present, output values will be lon, lat, rad. Default is x, y, z.")
.build());
options.addOption(Option.builder("centerXYZ")
.hasArg()
.desc(
"Only used to generate OBJ output. Center output shape on supplied coordinates. Specify XYZ coordinates as three floating point numbers separated"
+ " by commas.")
.build());
options.addOption(Option.builder("centerLonLat")
.hasArg()
.desc(
"Only used to generate OBJ output. Center output shape on supplied lon,lat. Specify lon,lat in degrees as floating point numbers separated"
+ " by a comma. Shape will be centered on the point closest to this lon,lat pair.")
.build());
options.addOption(Option.builder("halfSize")
.hasArg()
.desc(
"Only used to generate OBJ output. Used with -groundSampleDistance to resample to a uniform grid. Grid dimensions are (2*halfSize+1)x(2*halfSize+1).")
.build());
options.addOption(Option.builder("groundSampleDistance")
.hasArg()
.desc(
"Used with -halfSize to resample to a uniform grid. Spacing between grid points. Only used to generate OBJ output. "
+ "Units are the same as the input file, usually km.")
.build());
options.addOption(Option.builder("mapRadius")
.hasArg()
.desc(
"Only used to generate OBJ output. Used with -centerXYZ to resample to a uniform grid. Only include points within "
+ "mapRadius*groundSampleDistance*halfSize of centerXYZ. Default value is sqrt(2).")
.build());
options.addOption(Option.builder("gmtArgs")
.hasArg()
.longOpt("gmt-args")
.desc(
"Only used to generate OBJ output. Pass additional options to GMTSurface. May be used multiple times, use once per additional argument.")
.build());
options.addOption(Option.builder("clip")
.hasArg()
.desc(
"Shrink bounding box to a relative size of <arg> and clip any points outside of it. Default is 1 (no clipping).")
@@ -484,12 +465,10 @@ public class PointCloudFormatConverter implements TerrasaurTool {
boolean inLLR = cl.hasOption("inllr");
boolean outLLR = cl.hasOption("outllr");
FORMATS inFormat = cl.hasOption("inputFormat")
? FORMATS.valueOf(cl.getOptionValue("inputFormat").toUpperCase())
: FORMATS.formatFromExtension(cl.getOptionValue("inputFile"));
FORMATS outFormat = cl.hasOption("outputFormat")
? FORMATS.valueOf(cl.getOptionValue("outputFormat").toUpperCase())
: FORMATS.formatFromExtension(cl.getOptionValue("outputFile"));
@@ -516,8 +495,7 @@ public class PointCloudFormatConverter implements TerrasaurTool {
if (cl.hasOption("centerLonLat")) {
String[] params = cl.getOptionValue("centerLonLat").split(",");
Vector3D lcDir = new Vector3D(
Math.toRadians(Double.parseDouble(params[0].trim())),
Math.toRadians(Double.parseDouble(params[1].trim())));
double[] center = null;
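The `new Vector3D(lon, lat).scalarMultiply(range)` calls above rely on the Commons Math two-argument `Vector3D` constructor, which builds a unit vector from an azimuth and an elevation. A plain-JDK sketch of the same spherical-to-Cartesian conversion, assuming angles in radians (no Commons Math dependency; names are illustrative):

```java
public class SphericalDemo {

  // Unit vector from azimuth (lon) and elevation (lat), scaled by range:
  // x = r*cos(lat)*cos(lon), y = r*cos(lat)*sin(lon), z = r*sin(lat)
  public static double[] toCartesian(double lonRad, double latRad, double range) {
    double cosLat = Math.cos(latRad);
    return new double[] {
      range * cosLat * Math.cos(lonRad),
      range * cosLat * Math.sin(lonRad),
      range * Math.sin(latRad)
    };
  }

  public static void main(String[] args) {
    double[] xyz = toCartesian(Math.toRadians(45), Math.toRadians(30), 2.0);
    double norm = Math.sqrt(xyz[0] * xyz[0] + xyz[1] * xyz[1] + xyz[2] * xyz[2]);
    // The Cartesian vector's length equals the input range.
    if (Math.abs(norm - 2.0) > 1e-12) throw new AssertionError("norm != range");
    System.out.printf("x=%.4f y=%.4f z=%.4f%n", xyz[0], xyz[1], xyz[2]);
  }
}
```

This is why the converter multiplies the unit direction by `range` rather than passing the radius to the constructor: the two-argument form always yields a vector of length 1.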

View File

@@ -0,0 +1,399 @@
/*
* The MIT License
* Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in
* all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
* THE SOFTWARE.
*/
package terrasaur.apps;
import java.awt.geom.Path2D;
import java.awt.geom.Point2D;
import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import java.util.Map;
import nom.tam.fits.FitsException;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.commons.math3.geometry.euclidean.twod.Vector2D;
import org.apache.commons.math3.geometry.euclidean.twod.hull.MonotoneChain;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import picante.math.coords.LatitudinalVector;
import spice.basic.SpiceException;
import spice.basic.Vector3;
import terrasaur.enums.FORMATS;
import terrasaur.templates.TerrasaurTool;
import terrasaur.utils.NativeLibraryLoader;
import terrasaur.utils.VectorStatistics;
import terrasaur.utils.tessellation.StereographicProjection;
import vtk.vtkPoints;
public class PointCloudOverlap implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
@Override
public String shortDescription() {
return "Find points in a point cloud which overlap a reference point cloud.";
}
@Override
public String fullDescription(Options options) {
String header = "";
String footer =
"\nThis program finds all points in the input point cloud which overlap the points in a reference point cloud.\n\n"
+ "Supported input formats are ASCII, BINARY, L2, OBJ, and VTK. Supported output formats are ASCII, BINARY, L2, and VTK. "
+ "ASCII format is white spaced delimited x y z coordinates. BINARY files must contain double precision x y z coordinates.\n\n"
+ "A plane is fit to the reference point cloud and all points in each cloud are projected onto this plane. Any point in the "
+ "projected input cloud which falls within the outline of the projected reference cloud is considered to be overlapping.";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
private Path2D.Double polygon;
private StereographicProjection proj;
/*- Useful for debugging
private vtkPoints ref2DPoints;
private vtkPoints input2DPoints;
private vtkPolyData polygonPolyData;
private vtkCellArray polygonCells;
private vtkPoints polygonPoints;
private vtkDoubleArray polygonSuccessArray;
*/
public PointCloudOverlap(Collection<double[]> refPoints) {
if (refPoints != null) {
VectorStatistics vStats = new VectorStatistics();
for (double[] pt : refPoints) vStats.add(new Vector3(pt));
Vector3D centerXYZ = vStats.getMean();
proj = new StereographicProjection(new LatitudinalVector(1, centerXYZ.getDelta(), centerXYZ.getAlpha()));
createRefPolygon(refPoints);
}
}
public StereographicProjection getProjection() {
return proj;
}
private void createRefPolygon(Collection<double[]> refPoints) {
List<Vector2D> stereographicPoints = new ArrayList<>();
for (double[] refPt : refPoints) {
Vector3D point3D = new Vector3D(refPt);
Point2D point = proj.forward(point3D.getDelta(), point3D.getAlpha());
stereographicPoints.add(new Vector2D(point.getX(), point.getY()));
}
/*-
ref2DPoints = new vtkPoints();
input2DPoints = new vtkPoints();
polygonPolyData = new vtkPolyData();
polygonCells = new vtkCellArray();
polygonPoints = new vtkPoints();
polygonSuccessArray = new vtkDoubleArray();
polygonSuccessArray.SetName("success");
polygonPolyData.SetPoints(polygonPoints);
polygonPolyData.SetLines(polygonCells);
polygonPolyData.GetCellData().AddArray(polygonSuccessArray);
for (Vector2D refPoint : refPoints)
ref2DPoints.InsertNextPoint(refPoint.getX(), refPoint.getY(), 0);
*/
MonotoneChain mc = new MonotoneChain();
List<Vector2D> vertices = new ArrayList<>(mc.findHullVertices(stereographicPoints));
/*-
for (int i = 1; i < vertices.size(); i++) {
Vector2D lastPt = vertices.get(i - 1);
Vector2D thisPt = vertices.get(i);
System.out.printf("%f %f to %f %f distance %f\n", lastPt.getX(), lastPt.getY(), thisPt.getX(),
thisPt.getY(), lastPt.distance(thisPt));
}
Vector2D lastPt = vertices.get(vertices.size() - 1);
Vector2D thisPt = vertices.get(0);
System.out.printf("%f %f to %f %f distance %f\n", lastPt.getX(), lastPt.getY(), thisPt.getX(),
thisPt.getY(), lastPt.distance(thisPt));
*/
// int id0 = 0;
for (Vector2D vertex : vertices) {
// int id1 = polygonPoints.InsertNextPoint(vertex.getX(), vertex.getY(), 0);
if (polygon == null) {
polygon = new Path2D.Double();
polygon.moveTo(vertex.getX(), vertex.getY());
} else {
polygon.lineTo(vertex.getX(), vertex.getY());
/*-
vtkLine line = new vtkLine();
line.GetPointIds().SetId(0, id0);
line.GetPointIds().SetId(1, id1);
polygonCells.InsertNextCell(line);
*/
}
// id0 = id1;
}
polygon.closePath();
/*-
vtkPolyDataWriter writer = new vtkPolyDataWriter();
writer.SetInputData(polygonPolyData);
writer.SetFileName("polygon2D.vtk");
writer.SetFileTypeToBinary();
writer.Update();
writer = new vtkPolyDataWriter();
polygonPolyData = new vtkPolyData();
polygonPolyData.SetPoints(ref2DPoints);
writer.SetInputData(polygonPolyData);
writer.SetFileName("refPoints.vtk");
writer.SetFileTypeToBinary();
writer.Update();
*/
}
public boolean isEnclosed(double[] xyz) {
Vector3D point = new Vector3D(xyz);
Point2D projected = proj.forward(point.getDelta(), point.getAlpha());
return polygon.contains(projected.getX(), projected.getY());
}
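`isEnclosed` reduces the 3-D overlap test to 2-D point-in-polygon: project the point, then ask the closed `java.awt.geom.Path2D` whether it contains the projected coordinates. A self-contained sketch of that containment step with a hand-built hull (stdlib only; the stereographic projection is omitted and the vertex list is a made-up example):

```java
import java.awt.geom.Path2D;

public class HullDemo {

  // Build a closed polygon from hull vertices, in order, as createRefPolygon does
  // with the MonotoneChain output: moveTo the first vertex, lineTo the rest, closePath.
  public static Path2D.Double buildPolygon(double[][] vertices) {
    Path2D.Double polygon = new Path2D.Double();
    polygon.moveTo(vertices[0][0], vertices[0][1]);
    for (int i = 1; i < vertices.length; i++) polygon.lineTo(vertices[i][0], vertices[i][1]);
    polygon.closePath();
    return polygon;
  }

  public static void main(String[] args) {
    // Unit-square hull: projected points falling inside count as overlapping.
    double[][] square = {{0, 0}, {1, 0}, {1, 1}, {0, 1}};
    Path2D.Double polygon = buildPolygon(square);
    System.out.println(polygon.contains(0.5, 0.5)); // inside the hull
    System.out.println(polygon.contains(2.0, 0.5)); // outside the hull
  }
}
```

Note that `Path2D.contains` uses the path's winding rule on the closed outline, which is why `closePath()` matters: without it the final hull edge would be missing.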
/**
* @param inputPoints points to consider
* @param scale scale factor
* @return indices of all points inside the scaled polygon
*/
public List<Integer> scalePoints(List<double[]> inputPoints, double scale) {
List<Vector2D> projected = new ArrayList<>();
for (double[] inputPoint : inputPoints) {
Vector3D point = new Vector3D(inputPoint);
Point2D projectedPoint = proj.forward(point.getDelta(), point.getAlpha());
projected.add(new Vector2D(projectedPoint.getX(), projectedPoint.getY()));
}
Vector2D center = new Vector2D(0, 0);
for (Vector2D inputPoint : projected) center = center.add(inputPoint);
center = center.scalarMultiply(1. / inputPoints.size());
List<Vector2D> translatedPoints = new ArrayList<>();
for (Vector2D inputPoint : projected) translatedPoints.add(inputPoint.subtract(center));
Path2D.Double thisPolygon = null;
MonotoneChain mc = new MonotoneChain();
Collection<Vector2D> vertices = mc.findHullVertices(translatedPoints);
for (Vector2D vertex : vertices) {
if (thisPolygon == null) {
thisPolygon = new Path2D.Double();
thisPolygon.moveTo(scale * vertex.getX(), scale * vertex.getY());
} else {
thisPolygon.lineTo(scale * vertex.getX(), scale * vertex.getY());
}
}
thisPolygon.closePath();
List<Integer> indices = new ArrayList<>();
for (int i = 0; i < projected.size(); i++) {
Vector2D inputPoint = projected.get(i);
if (thisPolygon.contains(inputPoint.getX() - center.getX(), inputPoint.getY() - center.getY()))
indices.add(i);
}
return indices;
}
private static Options defineOptions() {
Options options = new Options();
options.addOption(Option.builder("inputFormat")
.hasArg()
.desc("Format of input file. If not present format will be inferred from file extension.")
.build());
options.addOption(Option.builder("inputFile")
.required()
.hasArg()
.desc("Required. Name of input file.")
.build());
options.addOption(Option.builder("inllr")
.desc(
"If present, input values are assumed to be lon, lat, rad. Default is x, y, z. Only used with ASCII or BINARY formats.")
.build());
options.addOption(Option.builder("referenceFormat")
.hasArg()
.desc("Format of reference file. If not present, the format will be inferred from the file extension.")
.build());
options.addOption(Option.builder("referenceFile")
.required()
.hasArg()
.desc("Required. Name of reference file.")
.build());
options.addOption(Option.builder("refllr")
.desc(
"If present, reference values are assumed to be lon, lat, rad. Default is x, y, z. Only used with ASCII or BINARY formats.")
.build());
options.addOption(Option.builder("outputFormat")
.hasArg()
.desc("Format of output file. If not present, the format will be inferred from the file extension.")
.build());
options.addOption(Option.builder("outputFile")
.required()
.hasArg()
.desc("Required. Name of output file.")
.build());
options.addOption(Option.builder("outllr")
.desc(
"If present, output values will be lon, lat, rad. Default is x, y, z. Only used with ASCII or BINARY formats.")
.build());
options.addOption(Option.builder("scale")
.hasArg()
.desc("Value to scale bounding box containing intersect region. Default is 1.0.")
.build());
return options;
}

public static void main(String[] args) throws SpiceException, IOException, InterruptedException, FitsException {
NativeLibraryLoader.loadVtkLibraries();
NativeLibraryLoader.loadSpiceLibraries();
TerrasaurTool defaultOBJ = new PointCloudOverlap(null);
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
// Read the reference file
FORMATS refFormat = cl.hasOption("referenceFormat")
? FORMATS.valueOf(cl.getOptionValue("referenceFormat").toUpperCase())
: FORMATS.formatFromExtension(cl.getOptionValue("referenceFile"));
String refFile = cl.getOptionValue("referenceFile");
boolean refLLR = cl.hasOption("refllr");
PointCloudFormatConverter pcfc = new PointCloudFormatConverter(refFormat, FORMATS.VTK);
pcfc.read(refFile, refLLR);
vtkPoints referencePoints = pcfc.getPoints();
logger.info("{} points read from {}", referencePoints.GetNumberOfPoints(), refFile);
List<double[]> refPts = new ArrayList<>();
for (int i = 0; i < referencePoints.GetNumberOfPoints(); i++) {
refPts.add(referencePoints.GetPoint(i));
}
// create the overlap object and set the enclosing polygon
PointCloudOverlap pco = new PointCloudOverlap(refPts);
// Read the input point cloud
FORMATS inFormat = cl.hasOption("inputFormat")
? FORMATS.valueOf(cl.getOptionValue("inputFormat").toUpperCase())
: FORMATS.formatFromExtension(cl.getOptionValue("inputFile"));
String inFile = cl.getOptionValue("inputFile");
boolean inLLR = cl.hasOption("inllr");
pcfc = new PointCloudFormatConverter(inFormat, FORMATS.VTK);
pcfc.read(inFile, inLLR);
vtkPoints inputPoints = pcfc.getPoints();
logger.info("{} points read from {}", inputPoints.GetNumberOfPoints(), inFile);
List<Integer> enclosedIndices = new ArrayList<>();
for (int i = 0; i < inputPoints.GetNumberOfPoints(); i++) {
double[] pt = inputPoints.GetPoint(i);
if (pco.isEnclosed(pt)) enclosedIndices.add(i);
}
if (cl.hasOption("scale")) {
List<double[]> pts = new ArrayList<>();
for (Integer i : enclosedIndices) pts.add(inputPoints.GetPoint(i));
// this list includes which of the enclosed points are inside the scaled polygon
List<Integer> theseIndices = pco.scalePoints(pts, Double.parseDouble(cl.getOptionValue("scale")));
// now relate this list back to the original list of points
List<Integer> newIndices = new ArrayList<>();
for (Integer i : theseIndices) newIndices.add(enclosedIndices.get(i));
enclosedIndices = newIndices;
}
VectorStatistics xyzStats = new VectorStatistics();
VectorStatistics xyStats = new VectorStatistics();
for (Integer i : enclosedIndices) {
double[] thisPt = inputPoints.GetPoint(i);
Vector3D thisPt3D = new Vector3D(thisPt);
xyzStats.add(thisPt3D);
Point2D projectedPt = pco.getProjection().forward(thisPt3D.getDelta(), thisPt3D.getAlpha());
xyStats.add(new Vector3(projectedPt.getX(), projectedPt.getY(), 0));
}
logger.info(
"Center XYZ: {}, {}, {}",
xyzStats.getMean().getX(),
xyzStats.getMean().getY(),
xyzStats.getMean().getZ());
Vector3D centerXYZ = xyzStats.getMean();
logger.info(
"Center lon, lat: {}, {}\n",
Math.toDegrees(centerXYZ.getAlpha()),
Math.toDegrees(centerXYZ.getDelta()));
logger.info(
"xmin/xmax/extent: {}/{}/{}\n",
xyzStats.getMin().getX(),
xyzStats.getMax().getX(),
xyzStats.getMax().getX() - xyzStats.getMin().getX());
logger.info(
"ymin/ymax/extent: {}/{}/{}\n",
xyzStats.getMin().getY(),
xyzStats.getMax().getY(),
xyzStats.getMax().getY() - xyzStats.getMin().getY());
logger.info(
"zmin/zmax/extent: {}/{}/{}\n",
xyzStats.getMin().getZ(),
xyzStats.getMax().getZ(),
xyzStats.getMax().getZ() - xyzStats.getMin().getZ());
FORMATS outFormat = cl.hasOption("outputFormat")
? FORMATS.valueOf(cl.getOptionValue("outputFormat").toUpperCase())
: FORMATS.formatFromExtension(cl.getOptionValue("outputFile"));
vtkPoints pointsToWrite = new vtkPoints();
for (Integer i : enclosedIndices) pointsToWrite.InsertNextPoint(inputPoints.GetPoint(i));
pcfc = new PointCloudFormatConverter(FORMATS.VTK, outFormat);
pcfc.setPoints(pointsToWrite);
String outputFilename = cl.getOptionValue("outputFile");
pcfc.write(outputFilename, cl.hasOption("outllr"));
if (new File(outputFilename).exists()) {
logger.info("{} points written to {}", pointsToWrite.GetNumberOfPoints(), outputFilename);
} else {
logger.error("Could not write {}", outputFilename);
}
logger.info("Finished");
}
}
// TODO write out center of output pointcloud
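The centroid-relative hull scaling used by `scalePoints` above can be sketched independently. The following is a minimal illustration (hypothetical class name, and a fixed square in place of the computed convex hull) of why scaling the polygon about its centroid before the containment test tightens or relaxes the overlap region:

```java
import java.awt.geom.Path2D;

public class ScaledHullDemo {
    public static void main(String[] args) {
        // Shrink the hull to half size about its centroid, as a scale of 0.5 would.
        double scale = 0.5;
        double[][] hull = {{-1, -1}, {1, -1}, {1, 1}, {-1, 1}}; // stand-in for the convex hull

        // Build the scaled polygon, mirroring the moveTo/lineTo/closePath loop in scalePoints().
        Path2D.Double polygon = new Path2D.Double();
        for (int i = 0; i < hull.length; i++) {
            double x = scale * hull[i][0];
            double y = scale * hull[i][1];
            if (i == 0) polygon.moveTo(x, y);
            else polygon.lineTo(x, y);
        }
        polygon.closePath();

        // Points near the centroid survive the shrink; points near the boundary do not.
        System.out.println(polygon.contains(0.2, 0.2)); // true
        System.out.println(polygon.contains(0.9, 0.9)); // false
    }
}
```

With a `scale` below 1 the polygon contracts, so only points well inside the original overlap region survive; values above 1 relax the test.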

View File

@@ -156,8 +156,7 @@ public class PointCloudToPlane implements TerrasaurTool {
if (dist > halfExtent) halfExtent = dist;
}
ProjectionOrthographic p =
new ProjectionOrthographic(
ProjectionOrthographic p = new ProjectionOrthographic(
config.width(),
config.height(),
CoordConverters.convertToLatitudinal(
@@ -169,8 +168,7 @@ public class PointCloudToPlane implements TerrasaurTool {
canvas.plot(data);
((MapPlot) canvas).drawLatLonGrid(Math.toRadians(5), Math.toRadians(5), true);
canvas.drawColorBar(
ImmutableColorBar.builder()
canvas.drawColorBar(ImmutableColorBar.builder()
.rect(new Rectangle(config.leftMargin(), 40, config.width(), 10))
.ramp(ramp)
.numTicks(5)
@@ -196,8 +194,7 @@ public class PointCloudToPlane implements TerrasaurTool {
canvas.drawAxes();
canvas.plot(data);
canvas.drawColorBar(
ImmutableColorBar.builder()
canvas.drawColorBar(ImmutableColorBar.builder()
.rect(new Rectangle(config.leftMargin(), 40, config.width(), 10))
.ramp(ramp)
.numTicks(5)
@@ -209,72 +206,54 @@ public class PointCloudToPlane implements TerrasaurTool {
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("inputFormat")
options.addOption(Option.builder("inputFormat")
.hasArg()
.desc(
"Format of input file. If not present format is inferred from inputFile extension.")
.desc("Format of input file. If not present format is inferred from inputFile extension.")
.build());
options.addOption(
Option.builder("inputFile")
options.addOption(Option.builder("inputFile")
.required()
.hasArg()
.desc("Required. Name of input file.")
.build());
options.addOption(
Option.builder("inllr")
options.addOption(Option.builder("inllr")
.desc(
"If present, input values are assumed to be lon, lat, rad. Default is x, y, z. Only used with ASCII or BINARY formats.")
.build());
options.addOption(
Option.builder("logFile")
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(
Option.builder("logLevel")
options.addOption(Option.builder("logLevel")
.hasArg()
.desc(
"If present, print messages above selected priority. Valid values are "
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim()
+ ". Default is INFO.")
.build());
options.addOption(
Option.builder("outputFile")
options.addOption(Option.builder("outputFile")
.hasArg()
.desc(
"Name of output file to contain 4x4 transformation matrix. The top left 3x3 matrix is the rotation matrix. The top "
+ "three entries in the right hand column are the translation vector. The bottom row is always 0 0 0 1.\nTo convert "
+ "from global to local:\n transformed = rotation.mxv(point.sub(translation))")
.build());
options.addOption(
Option.builder("translate")
options.addOption(Option.builder("translate")
.hasArg()
.desc(
"Translate surface points and spacecraft position. "
.desc("Translate surface points and spacecraft position. "
+ "Specify by three floating point numbers separated by commas. "
+ "Default is to use centroid of input point cloud.")
.build());
options.addOption(
Option.builder("plotXYZ")
options.addOption(Option.builder("plotXYZ")
.hasArg()
.desc(
"Plot X vs Y (in the local frame) colored by Z. "
+ "Argument is the name of PNG file to write.")
.desc("Plot X vs Y (in the local frame) colored by Z. " + "Argument is the name of PNG file to write.")
.build());
options.addOption(
Option.builder("plotXYR")
options.addOption(Option.builder("plotXYR")
.hasArg()
.desc(
"Plot X vs Y (in the local frame) colored by R. "
+ "Argument is the name of PNG file to write.")
.desc("Plot X vs Y (in the local frame) colored by R. " + "Argument is the name of PNG file to write.")
.build());
options.addOption(
Option.builder("slope")
.desc(
"Choose local coordinate frame such that Z points normal to the plane "
options.addOption(Option.builder("slope")
.desc("Choose local coordinate frame such that Z points normal to the plane "
+ "and X points along the direction of steepest descent.")
.build());
return options;
@@ -297,8 +276,7 @@ public class PointCloudToPlane implements TerrasaurTool {
String inFile = cl.getOptionValue("inputFile");
boolean inLLR = cl.hasOption("inllr");
FORMATS inFormat =
cl.hasOption("inputFormat")
FORMATS inFormat = cl.hasOption("inputFormat")
? FORMATS.valueOf(cl.getOptionValue("inputFormat").toUpperCase())
: FORMATS.formatFromExtension(inFile);
@@ -314,29 +292,23 @@ public class PointCloudToPlane implements TerrasaurTool {
Vector3 translation;
if (cl.hasOption("translate")) {
translation =
MathConversions.toVector3(VectorUtils.stringToVector3D(cl.getOptionValue("translate")));
translation = MathConversions.toVector3(VectorUtils.stringToVector3D(cl.getOptionValue("translate")));
pctp.getGMU().setTranslation(translation.toArray());
}
pctp.getGMU().calculateTransformation();
List<Vector3> globalPts = new ArrayList<>();
for (int i = 0; i < points.GetNumberOfPoints(); i++)
globalPts.add(new Vector3(points.GetPoint(i)));
for (int i = 0; i < points.GetNumberOfPoints(); i++) globalPts.add(new Vector3(points.GetPoint(i)));
double[][] transformation = pctp.getGMU().getTransformation();
StringBuilder sb =
new StringBuilder(
String.format(
StringBuilder sb = new StringBuilder(String.format(
"translation vector:\n%24.16e%24.16e%24.16e\n",
transformation[0][3], transformation[1][3], transformation[2][3]));
logger.info(sb.toString());
sb = new StringBuilder("rotation matrix:\n");
for (int i = 0; i < 3; i++)
sb.append(
String.format(
"%24.16e%24.16e%24.16e\n",
transformation[i][0], transformation[i][1], transformation[i][2]));
sb.append(String.format(
"%24.16e%24.16e%24.16e\n", transformation[i][0], transformation[i][1], transformation[i][2]));
logger.info(sb.toString());
Matrix33 rotation = new Matrix33(pctp.getGMU().getRotation());

View File

@@ -24,7 +24,6 @@ package terrasaur.apps;
import java.util.ArrayList;
import java.util.Map;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
@@ -63,8 +62,11 @@ public class PrintShapeModelStatistics implements TerrasaurTool {
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("objFile").required().hasArg().desc("Path to OBJ file.").build());
options.addOption(Option.builder("objFile")
.required()
.hasArg()
.desc("Path to OBJ file.")
.build());
return options;
}

View File

@@ -172,8 +172,7 @@ public class RangeFromSumFile implements TerrasaurTool {
polyData.GetPoint(id1, pt1);
polyData.GetPoint(id2, pt2);
TriangularFacet facet =
new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
TriangularFacet facet = new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
Vector3 center3 = MathConversions.toVector3(facet.getCenter());
Vector3D center3D = MathConversions.toVector3D(facet.getCenter());
@@ -185,7 +184,6 @@ public class RangeFromSumFile implements TerrasaurTool {
tiltDir = Tilts.basicTiltDirDeg(surfaceIntercept.getAlpha(), normal3D);
incidence = Vector3D.angle(sunXYZ, normal3D);
emission = Vector3D.angle(scPos, normal3D);
phase = Vector3D.angle(sunXYZ, scPos.subtract(center3D));
@@ -193,7 +191,8 @@ public class RangeFromSumFile implements TerrasaurTool {
try {
// scPos is in body fixed coordinates
Plane p = new Plane(normal3, center3);
Vector3 projectedNorth = p.project(new Vector3(0, 0, 1).add(center3)).sub(center3);
Vector3 projectedNorth =
p.project(new Vector3(0, 0, 1).add(center3)).sub(center3);
Vector3 projected = p.project(MathConversions.toVector3(scPos)).sub(center3);
scAzimuth = projected.sep(projectedNorth);
@@ -201,7 +200,8 @@ public class RangeFromSumFile implements TerrasaurTool {
scElevation = Math.PI / 2 - emission;
// sunXYZ is a unit vector pointing to the sun
projected = p.project(MathConversions.toVector3(sunXYZ).add(center3)).sub(center3);
projected = p.project(MathConversions.toVector3(sunXYZ).add(center3))
.sub(center3);
sunAzimuth = projected.sep(projectedNorth);
if (projected.cross(projectedNorth).dot(center3) < 0) sunAzimuth = 2 * Math.PI - sunAzimuth;
@@ -295,53 +295,45 @@ public class RangeFromSumFile implements TerrasaurTool {
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("sumFile")
options.addOption(Option.builder("sumFile")
.required()
.hasArg()
.desc("Required. Name of sum file to read.")
.build());
options.addOption(
Option.builder("objFile")
options.addOption(Option.builder("objFile")
.required()
.hasArg()
.desc("Required. Name of OBJ shape file.")
.build());
options.addOption(
Option.builder("pixelOffset")
options.addOption(Option.builder("pixelOffset")
.hasArg()
.desc(
"Pixel offset from center of image, given as a comma separated pair (no spaces). Default is 0,0. "
+ "x increases to the right and y increases down.")
.build());
options.addOption(
Option.builder("xRange")
options.addOption(Option.builder("xRange")
.hasArg()
.desc(
"Range of X pixel offsets from center of image, given as a comma separated triplet (xStart, xStop, xSpacing with no spaces). "
+ "For example -50,50,5.")
.build());
options.addOption(
Option.builder("yRange")
options.addOption(Option.builder("yRange")
.hasArg()
.desc(
"Range of Y pixel offsets from center of image, given as a comma separated triplet (yStart, yStop, ySpacing with no spaces). "
+ "For example -50,50,5.")
.build());
options.addOption(
Option.builder("radius")
options.addOption(Option.builder("radius")
.hasArg()
.desc(
"Evaluate all pixels within specified distance (in pixels) of desired pixel. This value will be rounded to the nearest integer.")
.build());
options.addOption(
Option.builder("distanceScale")
options.addOption(Option.builder("distanceScale")
.hasArg()
.desc(
"Spacecraft position is assumed to be in kilometers. If not, scale by this value (e.g. Use 0.001 if s/c pos is in meters).")
.build());
options.addOption(
Option.builder("stats")
options.addOption(Option.builder("stats")
.desc("Print out statistics about range to all selected pixels.")
.build());
return options;
@@ -417,8 +409,7 @@ public class RangeFromSumFile implements TerrasaurTool {
if (checkRadius > 0) {
double midx = (xStart + xStop) / 2.;
double midy = (yStart + yStop) / 2.;
if ((ix - midx) * (ix - midx) + (iy - midy) * (iy - midy) > checkRadius * checkRadius)
continue;
if ((ix - midx) * (ix - midx) + (iy - midy) * (iy - midy) > checkRadius * checkRadius) continue;
}
long cellID = rfsf.findIntercept(ix, iy).getKey();
if (cellID > -1) System.out.println(rfsf);

View File

@@ -155,9 +155,13 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
for (int i = 0; i < points.GetNumberOfPoints(); i++) {
Vector3D thisPoint = new Vector3D(points.GetPoint(i));
if (scale != null)
thisPoint = thisPoint.subtract(center).scalarMultiply(scale).add(center);
thisPoint = thisPoint
.subtract(center)
.scalarMultiply(scale)
.add(center);
if (rotation != null)
thisPoint = rotation.applyTo(thisPoint.subtract(center)).add(center);
thisPoint =
rotation.applyTo(thisPoint.subtract(center)).add(center);
points.SetPoint(i, thisPoint.toArray());
}
}
@@ -254,11 +258,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
* @return Brightness structure
*/
private Brightness getBrightness(
PhotometricFunction pf,
SmallBodyModel sbm,
long intersect,
Vector3D intersectPoint,
boolean isDefault) {
PhotometricFunction pf, SmallBodyModel sbm, long intersect, Vector3D intersectPoint, boolean isDefault) {
Vector3D facetToCamera = cameraXYZ.subtract(intersectPoint);
@@ -269,8 +269,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
double distFromCamera = facetToCamera.getNorm();
// speeds up calculation along the limb. Need to combine all pixels in the ifov.
double kmPerPixel =
ifov * distFromCamera / Math.abs(FastMath.cos(Math.min(Math.toRadians(60), emission)));
double kmPerPixel = ifov * distFromCamera / Math.abs(FastMath.cos(Math.min(Math.toRadians(60), emission)));
double sum = 0;
@@ -304,16 +303,13 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
}
}
double albedo = (isDefault && albedoMap.containsKey(cell)) ? albedoMap.get(cell) : 1;
sum +=
albedo
* pf.getValue(
FastMath.cos(incidence), FastMath.cos(emission), FastMath.toDegrees(phase));
sum += albedo * pf.getValue(FastMath.cos(incidence), FastMath.cos(emission), FastMath.toDegrees(phase));
}
logger.printf(
Level.DEBUG,
"Thread %d lat/lon %.2f/%.2f, %s, sum %f, cells %d, %.2f",
Thread.currentThread().getId(),
Thread.currentThread().threadId(),
Math.toDegrees(intersectPoint.getDelta()),
Math.toDegrees(intersectPoint.getAlpha()),
intersectPoint.toString(),
@@ -321,8 +317,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
cells.size(),
sum / cells.size());
return new Brightness(
incidence, emission, phase, sum / cells.size(), distFromCamera, facetToCamera, normal);
return new Brightness(incidence, emission, phase, sum / cells.size(), distFromCamera, facetToCamera, normal);
}
class BrightnessCalculator implements Callable<Map<Integer, Brightness>> {
@@ -338,7 +333,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
@Override
public Map<Integer, Brightness> call() throws Exception {
logger.info("Thread {}: starting", Thread.currentThread().getId());
logger.info("Thread {}: starting", Thread.currentThread().threadId());
int xPixels = subPixel * nPixelsX;
@@ -352,8 +347,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
int i = index % xPixels;
Vector3D pixelDir = pixelToBodyFixed(((double) i) / subPixel, ((double) j) / subPixel);
long intersect =
globalModel.computeRayIntersection(cameraXYZArray, pixelDir.toArray(), intersectPoint);
long intersect = globalModel.computeRayIntersection(cameraXYZArray, pixelDir.toArray(), intersectPoint);
if (intersect > -1) {
@@ -371,16 +365,14 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
SmallBodyModel localModel = lmc.get(intersectPt3D);
if (localModel != null) {
long localIntersect =
localModel.computeRayIntersection(
long localIntersect = localModel.computeRayIntersection(
cameraXYZArray, pixelDir.toArray(), localIntersectPoint);
if (localIntersect != -1) {
break;
} else {
logger.debug(
String.format(
logger.debug(String.format(
"Thread %d: No intersection with local model for pixel (%d,%d): lat/lon %.2f/%.2f, using global intersection %d %s",
Thread.currentThread().getId(),
Thread.currentThread().threadId(),
i,
j,
Math.toDegrees(intersectPt3D.getDelta()),
@@ -397,7 +389,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
}
}
logger.info("Thread {}: finished", Thread.currentThread().getId());
logger.info("Thread {}: finished", Thread.currentThread().threadId());
return brightness;
}
@@ -411,7 +403,8 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
Map<Integer, Brightness> brightness = new HashMap<>();
try (ExecutorService executor = Executors.newFixedThreadPool(numThreads)) {
List<Integer> indices = IntStream.range(0, yPixels * xPixels).boxed().toList();
List<Integer> indices =
IntStream.range(0, yPixels * xPixels).boxed().toList();
int numPixels = indices.size();
@@ -467,7 +460,8 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
try (ExecutorService executor = Executors.newFixedThreadPool(numThreads)) {
List<Integer> indices = IntStream.range(0, yPixels * xPixels).boxed().toList();
List<Integer> indices =
IntStream.range(0, yPixels * xPixels).boxed().toList();
int numPixels = indices.size();
@@ -529,8 +523,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
}
BufferedImage img = new BufferedImage(nPixelsX, nPixelsY, BufferedImage.TYPE_INT_RGB);
img.createGraphics()
.drawImage(image.getScaledInstance(nPixelsX, nPixelsY, Image.SCALE_SMOOTH), 0, 0, null);
img.createGraphics().drawImage(image.getScaledInstance(nPixelsX, nPixelsY, Image.SCALE_SMOOTH), 0, 0, null);
return img;
}
@@ -559,8 +552,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
double[] intersectPoint = new double[3];
Vector3D boresight = sumFile.boresight();
long intersect =
getGlobalModel()
.computeRayIntersection(cameraXYZ.toArray(), boresight.toArray(), intersectPoint);
getGlobalModel().computeRayIntersection(cameraXYZ.toArray(), boresight.toArray(), intersectPoint);
if (intersect > 0) {
Vector3D nadirPt = new Vector3D(intersectPoint);
double lat = nadirPt.getDelta();
@@ -573,17 +565,20 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
double range = cameraXYZ.subtract(nadirPt).getNorm();
addMetaData("image.cell", "Index of center pixel cell", Long.toString(intersect));
-addMetaData("image.lat", "Center latitude", StringFunctions.toDegreesLat("%.2f ").apply(lat));
addMetaData(
-    "image.lon", "Center longitude", StringFunctions.toDegreesELon("%.2f ").apply(lon));
+    "image.lat",
+    "Center latitude",
+    StringFunctions.toDegreesLat("%.2f ").apply(lat));
addMetaData(
-    "image.inc", "Center incidence in degrees", String.format("%.2f", Math.toDegrees(inc)));
+    "image.lon",
+    "Center longitude",
+    StringFunctions.toDegreesELon("%.2f ").apply(lon));
+addMetaData("image.inc", "Center incidence in degrees", String.format("%.2f", Math.toDegrees(inc)));
addMetaData(
    "image.ems",
    "Center emission in degrees (may not be zero if facet is tilted)",
    String.format("%.2f", Math.toDegrees(ems)));
-addMetaData(
-    "image.phs", "Center phase in degrees", String.format("%.2f", Math.toDegrees(phs)));
+addMetaData("image.phs", "Center phase in degrees", String.format("%.2f", Math.toDegrees(phs)));
addMetaData("image.range", "Center point range in m", String.format("%.3f", range * 1e3));
addMetaData(
"image.resolution",
@@ -659,8 +654,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("albedoFile")
options.addOption(Option.builder("albedoFile")
.hasArg()
.desc(
"""
@@ -674,8 +668,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
.replaceAll("\\s+", " ")
.strip())
.build());
options.addOption(
Option.builder("localModels")
options.addOption(Option.builder("localModels")
.hasArgs()
.desc(
"""
@@ -695,46 +688,38 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
.replaceAll("\\s+", " ")
.strip())
.build());
options.addOption(
Option.builder("logFile")
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(
Option.builder("logLevel")
options.addOption(Option.builder("logLevel")
.hasArg()
.desc(
"If present, print messages above selected priority. Valid values are "
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim()
+ ". Default is INFO.")
.build());
options.addOption(
Option.builder("model")
options.addOption(Option.builder("model")
.required()
.hasArg()
.desc(
"Required. Default shape model filename. Supported formats are OBJ, PLT, PLY, STL, or VTK format.")
.build());
options.addOption(
Option.builder("numThreads")
options.addOption(Option.builder("numThreads")
.hasArg()
.desc("Number of threads to run in parallel when generating the image. Default is 2.")
.build());
options.addOption(
Option.builder("photo")
options.addOption(Option.builder("photo")
.hasArg()
.desc(PhotometricFunction.getOptionString().trim() + " Default is OREX.")
.build());
options.addOption(
Option.builder("output")
options.addOption(Option.builder("output")
.required()
.hasArg()
.desc("Required. Name of image file to write. Valid extensions are fits or png.")
.build());
options.addOption(
Option.builder("rotateModel")
options.addOption(Option.builder("rotateModel")
.hasArg()
.desc(
"""
@@ -745,13 +730,11 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
.replaceAll("\\s+", " ")
.strip())
.build());
options.addOption(
Option.builder("scaleModel")
options.addOption(Option.builder("scaleModel")
.hasArg()
.desc("If present, factor to scale shape model. The center is unchanged.")
.build());
options.addOption(
Option.builder("subPixel")
options.addOption(Option.builder("subPixel")
.hasArg()
.desc(
"""
@@ -764,8 +747,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
.replaceAll("\\s+", " ")
.strip())
.build());
options.addOption(
Option.builder("sumFile")
options.addOption(Option.builder("sumFile")
.required()
.hasArg()
.desc("Required. Name of sum file to read.")
@@ -786,15 +768,11 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
NativeLibraryLoader.loadVtkLibraries();
Double scale =
cl.hasOption("scaleModel") ? Double.parseDouble(cl.getOptionValue("scaleModel")) : null;
Double scale = cl.hasOption("scaleModel") ? Double.parseDouble(cl.getOptionValue("scaleModel")) : null;
Rotation rotation =
cl.hasOption("rotateModel")
? RotationUtils.stringToRotation(cl.getOptionValue("rotateModel"))
: null;
cl.hasOption("rotateModel") ? RotationUtils.stringToRotation(cl.getOptionValue("rotateModel")) : null;
RenderShapeFromSumFile app =
new RenderShapeFromSumFile(cl.getOptionValue("model"), scale, rotation);
RenderShapeFromSumFile app = new RenderShapeFromSumFile(cl.getOptionValue("model"), scale, rotation);
SumFile sumFile = app.loadSumFile(cl.getOptionValue("sumFile"));
if (cl.hasOption("albedoFile")) app.loadAlbedoFile(cl.getOptionValue("albedoFile"));
@@ -805,10 +783,8 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
if (cl.hasOption("subPixel")) app.setSubPixel(Integer.parseInt(cl.getOptionValue("subPixel")));
PhotometricFunction pf = PhotometricFunction.OREX1;
if (cl.hasOption("photo"))
pf = PhotometricFunction.getPhotometricFunction(cl.getOptionValue("photo"));
int numThreads =
cl.hasOption("numThreads") ? Integer.parseInt(cl.getOptionValue("numThreads")) : 2;
if (cl.hasOption("photo")) pf = PhotometricFunction.getPhotometricFunction(cl.getOptionValue("photo"));
int numThreads = cl.hasOption("numThreads") ? Integer.parseInt(cl.getOptionValue("numThreads")) : 2;
String outputFilename = cl.getOptionValue("output");
String dirname = FilenameUtils.getPath(outputFilename);
@@ -828,8 +804,7 @@ public class RenderShapeFromSumFile implements TerrasaurTool {
Fits fits = new Fits();
ImageHDU imageHDU = (ImageHDU) Fits.makeHDU(app.getFits(pf, numThreads));
Header header = imageHDU.getHeader();
header.addValue(
DateTime.TIMESYS_UTC, app.metadata.get("image.utc").getValue(), "Time from the SUM file");
header.addValue(DateTime.TIMESYS_UTC, app.metadata.get("image.utc").getValue(), "Time from the SUM file");
header.addValue("TITLE", sumFile.picnm(), "Title of SUM file");
header.addValue("PLANE1", "brightness", "from 0 to 1");
header.addValue("PLANE2", "incidence", "degrees");

View File

@@ -62,57 +62,87 @@ public class RotationConversion implements TerrasaurTool {
""";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("logFile").hasArg()
.desc("If present, save screen output to log file.").build());
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values())
sb.append(String.format("%s ", l.name()));
options.addOption(Option.builder("logLevel").hasArg()
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(Option.builder("logLevel")
.hasArg()
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim() + ". Default is INFO.")
.build());
options.addOption(Option.builder("angle").hasArg().desc("Rotation angle, in radians.").build());
options.addOption(
Option.builder("axis0").hasArg().desc("First element of rotation axis.").build());
options.addOption(
Option.builder("axis1").hasArg().desc("Second element of rotation axis.").build());
options.addOption(
Option.builder("axis2").hasArg().desc("Third element of rotation axis.").build());
options.addOption(Option.builder("cardanXYZ1").hasArg()
.desc("Cardan angle for the first rotation (about the X axis) in radians.").build());
options.addOption(Option.builder("cardanXYZ2").hasArg()
.desc("Cardan angle for the second rotation (about the Y axis) in radians.").build());
options.addOption(Option.builder("cardanXYZ3").hasArg()
.desc("Cardan angle for the third rotation (about the Z axis) in radians.").build());
options.addOption(Option.builder("eulerZXZ1").hasArg()
.desc("Euler angle for the first rotation (about the Z axis) in radians.").build());
options.addOption(Option.builder("eulerZXZ2").hasArg()
options.addOption(Option.builder("angle")
.hasArg()
.desc("Rotation angle, in radians.")
.build());
options.addOption(Option.builder("axis0")
.hasArg()
.desc("First element of rotation axis.")
.build());
options.addOption(Option.builder("axis1")
.hasArg()
.desc("Second element of rotation axis.")
.build());
options.addOption(Option.builder("axis2")
.hasArg()
.desc("Third element of rotation axis.")
.build());
options.addOption(Option.builder("cardanXYZ1")
.hasArg()
.desc("Cardan angle for the first rotation (about the X axis) in radians.")
.build());
options.addOption(Option.builder("cardanXYZ2")
.hasArg()
.desc("Cardan angle for the second rotation (about the Y axis) in radians.")
.build());
options.addOption(Option.builder("cardanXYZ3")
.hasArg()
.desc("Cardan angle for the third rotation (about the Z axis) in radians.")
.build());
options.addOption(Option.builder("eulerZXZ1")
.hasArg()
.desc("Euler angle for the first rotation (about the Z axis) in radians.")
.build());
options.addOption(Option.builder("eulerZXZ2")
.hasArg()
.desc("Euler angle for the second rotation (about the rotated X axis) in radians.")
.build());
options.addOption(Option.builder("eulerZXZ3").hasArg()
.desc("Euler angle for the third rotation (about the rotated Z axis) in radians.").build());
options.addOption(
Option.builder("q0").hasArg().desc("Scalar term for quaternion: cos(theta/2)").build());
options.addOption(Option.builder("q1").hasArg()
.desc("First vector term for quaternion: sin(theta/2) * V[0]").build());
options.addOption(Option.builder("q2").hasArg()
.desc("Second vector term for quaternion: sin(theta/2) * V[1]").build());
options.addOption(Option.builder("q3").hasArg()
.desc("Third vector term for quaternion: sin(theta/2) * V[2]").build());
options.addOption(Option.builder("matrix").hasArg()
options.addOption(Option.builder("eulerZXZ3")
.hasArg()
.desc("Euler angle for the third rotation (about the rotated Z axis) in radians.")
.build());
options.addOption(Option.builder("q0")
.hasArg()
.desc("Scalar term for quaternion: cos(theta/2)")
.build());
options.addOption(Option.builder("q1")
.hasArg()
.desc("First vector term for quaternion: sin(theta/2) * V[0]")
.build());
options.addOption(Option.builder("q2")
.hasArg()
.desc("Second vector term for quaternion: sin(theta/2) * V[1]")
.build());
options.addOption(Option.builder("q3")
.hasArg()
.desc("Third vector term for quaternion: sin(theta/2) * V[2]")
.build());
options.addOption(Option.builder("matrix")
.hasArg()
.desc("name of file containing rotation matrix to convert to Euler angles. "
+ "Format is 3x3 array in plain text separated by white space.")
.build());
options.addOption(Option.builder("anglesInDegrees").desc(
"If present, input angles in degrees and print output angles in degrees. Default is false.")
.build()); return options;
options.addOption(Option.builder("anglesInDegrees")
.desc("If present, input angles in degrees and print output angles in degrees. Default is false.")
.build());
return options;
}
public static void main(String[] args) throws Exception {
@@ -127,14 +157,11 @@ public class RotationConversion implements TerrasaurTool {
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
boolean inDegrees = cl.hasOption("anglesInDegrees");
boolean axisAndAngle = cl.hasOption("angle") && cl.hasOption("axis0") && cl.hasOption("axis1")
&& cl.hasOption("axis2");
boolean cardanXYZ =
cl.hasOption("cardanXYZ1") && cl.hasOption("cardanXYZ2") && cl.hasOption("cardanXYZ3");
boolean eulerZXZ =
cl.hasOption("eulerZXZ1") && cl.hasOption("eulerZXZ2") && cl.hasOption("eulerZXZ3");
boolean quaternion =
cl.hasOption("q0") && cl.hasOption("q1") && cl.hasOption("q2") && cl.hasOption("q3");
boolean axisAndAngle =
cl.hasOption("angle") && cl.hasOption("axis0") && cl.hasOption("axis1") && cl.hasOption("axis2");
boolean cardanXYZ = cl.hasOption("cardanXYZ1") && cl.hasOption("cardanXYZ2") && cl.hasOption("cardanXYZ3");
boolean eulerZXZ = cl.hasOption("eulerZXZ1") && cl.hasOption("eulerZXZ2") && cl.hasOption("eulerZXZ3");
boolean quaternion = cl.hasOption("q0") && cl.hasOption("q1") && cl.hasOption("q2") && cl.hasOption("q3");
boolean matrix = cl.hasOption("matrix");
if (!(axisAndAngle || cardanXYZ || eulerZXZ || quaternion || matrix)) {
@@ -145,26 +172,25 @@ public class RotationConversion implements TerrasaurTool {
Rotation r = null;
if (matrix) {
List<String> lines =
FileUtils.readLines(new File(cl.getOptionValue("matrix")), Charset.defaultCharset());
List<String> lines = FileUtils.readLines(new File(cl.getOptionValue("matrix")), Charset.defaultCharset());
double[][] m = new double[3][3];
for (int i = 0; i < 3; i++) {
String[] parts = lines.get(i).trim().split("\\s+");
for (int j = 0; j < 3; j++)
m[i][j] = Double.parseDouble(parts[j].trim());
for (int j = 0; j < 3; j++) m[i][j] = Double.parseDouble(parts[j].trim());
}
r = new Rotation(m, 1e-10);
}
if (axisAndAngle) {
double angle = Double.parseDouble(cl.getOptionValue("angle").trim());
if (inDegrees)
angle = Math.toRadians(angle);
if (inDegrees) angle = Math.toRadians(angle);
r = new Rotation(
new Vector3D(Double.parseDouble(cl.getOptionValue("axis0").trim()),
new Vector3D(
Double.parseDouble(cl.getOptionValue("axis0").trim()),
Double.parseDouble(cl.getOptionValue("axis1").trim()),
Double.parseDouble(cl.getOptionValue("axis2").trim())),
angle, RotationConvention.FRAME_TRANSFORM);
angle,
RotationConvention.FRAME_TRANSFORM);
}
if (cardanXYZ) {
@@ -176,8 +202,7 @@ public class RotationConversion implements TerrasaurTool {
angle2 = Math.toRadians(angle2);
angle3 = Math.toRadians(angle3);
}
r = new Rotation(RotationOrder.XYZ, RotationConvention.FRAME_TRANSFORM, angle1, angle2,
angle3);
r = new Rotation(RotationOrder.XYZ, RotationConvention.FRAME_TRANSFORM, angle1, angle2, angle3);
}
if (eulerZXZ) {
@@ -189,15 +214,16 @@ public class RotationConversion implements TerrasaurTool {
angle2 = Math.toRadians(angle2);
angle3 = Math.toRadians(angle3);
}
r = new Rotation(RotationOrder.ZXZ, RotationConvention.FRAME_TRANSFORM, angle1, angle2,
angle3);
r = new Rotation(RotationOrder.ZXZ, RotationConvention.FRAME_TRANSFORM, angle1, angle2, angle3);
}
if (quaternion) {
r = new Rotation(Double.parseDouble(cl.getOptionValue("q0").trim()),
r = new Rotation(
Double.parseDouble(cl.getOptionValue("q0").trim()),
Double.parseDouble(cl.getOptionValue("q1").trim()),
Double.parseDouble(cl.getOptionValue("q2").trim()),
Double.parseDouble(cl.getOptionValue("q3").trim()), true);
Double.parseDouble(cl.getOptionValue("q3").trim()),
true);
}
double[][] m = r.getMatrix();
@@ -207,19 +233,20 @@ public class RotationConversion implements TerrasaurTool {
System.out.println(matrixString);
String axisAndAngleString = inDegrees
? String.format("angle (degrees), axis:\n%g, %s", Math.toDegrees(r.getAngle()),
r.getAxis(RotationConvention.FRAME_TRANSFORM))
: String.format("angle (radians), axis:\n%g, %s", r.getAngle(),
r.getAxis(RotationConvention.FRAME_TRANSFORM));
? String.format(
"angle (degrees), axis:\n%g, %s",
Math.toDegrees(r.getAngle()), r.getAxis(RotationConvention.FRAME_TRANSFORM))
: String.format(
"angle (radians), axis:\n%g, %s", r.getAngle(), r.getAxis(RotationConvention.FRAME_TRANSFORM));
System.out.println(axisAndAngleString);
try {
double[] angles = r.getAngles(RotationOrder.XYZ, RotationConvention.FRAME_TRANSFORM);
String cardanString = inDegrees
? String.format("Cardan XYZ angles (degrees):\n%g, %g, %g", Math.toDegrees(angles[0]),
Math.toDegrees(angles[1]), Math.toDegrees(angles[2]))
: String.format("Cardan XYZ angles (radians):\n%g, %g, %g", angles[0], angles[1],
angles[2]);
? String.format(
"Cardan XYZ angles (degrees):\n%g, %g, %g",
Math.toDegrees(angles[0]), Math.toDegrees(angles[1]), Math.toDegrees(angles[2]))
: String.format("Cardan XYZ angles (radians):\n%g, %g, %g", angles[0], angles[1], angles[2]);
System.out.println(cardanString);
} catch (CardanEulerSingularityException e) {
System.out.println("Cardan angles: encountered singularity, cannot solve");
@@ -228,10 +255,10 @@ public class RotationConversion implements TerrasaurTool {
try {
double[] angles = r.getAngles(RotationOrder.ZXZ, RotationConvention.FRAME_TRANSFORM);
String eulerString = inDegrees
? String.format("Euler ZXZ angles (degrees):\n%g, %g, %g", Math.toDegrees(angles[0]),
Math.toDegrees(angles[1]), Math.toDegrees(angles[2]))
: String.format("Euler ZXZ angles (radians):\n%g, %g, %g", angles[0], angles[1],
angles[2]);
? String.format(
"Euler ZXZ angles (degrees):\n%g, %g, %g",
Math.toDegrees(angles[0]), Math.toDegrees(angles[1]), Math.toDegrees(angles[2]))
: String.format("Euler ZXZ angles (radians):\n%g, %g, %g", angles[0], angles[1], angles[2]);
System.out.println(eulerString);
} catch (CardanEulerSingularityException e) {
System.out.println("Euler angles: encountered singularity, cannot solve");
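RotationConversion delegates all of these conversions to Apache Commons Math's `Rotation` class. The quaternion-to-matrix step it prints can be sketched in plain Java as below (a minimal illustrative sketch, not the tool's code; the class and method names are hypothetical, and the matrix is written in the frame-transform convention, which is the transpose of the vector-rotation convention):

```java
public class QuaternionToMatrix {

  // Convert a quaternion (q0 scalar, q1..q3 vector) to a 3x3 rotation
  // matrix using the standard formula, normalizing first. This mirrors
  // what Rotation.getMatrix() produces, up to the transpose that
  // distinguishes frame transforms from vector rotations.
  static double[][] toMatrix(double q0, double q1, double q2, double q3) {
    double n = Math.sqrt(q0 * q0 + q1 * q1 + q2 * q2 + q3 * q3);
    q0 /= n; q1 /= n; q2 /= n; q3 /= n; // normalize to a unit quaternion
    return new double[][] {
      {1 - 2 * (q2 * q2 + q3 * q3), 2 * (q1 * q2 + q0 * q3),     2 * (q1 * q3 - q0 * q2)},
      {2 * (q1 * q2 - q0 * q3),     1 - 2 * (q1 * q1 + q3 * q3), 2 * (q2 * q3 + q0 * q1)},
      {2 * (q1 * q3 + q0 * q2),     2 * (q2 * q3 - q0 * q1),     1 - 2 * (q1 * q1 + q2 * q2)}
    };
  }

  // Recover the rotation angle from the matrix trace: trace = 1 + 2 cos(theta).
  // The angle is the same in either convention, since transposing
  // preserves the trace.
  static double angleFromMatrix(double[][] m) {
    double trace = m[0][0] + m[1][1] + m[2][2];
    return Math.acos(Math.max(-1.0, Math.min(1.0, (trace - 1) / 2)));
  }

  public static void main(String[] args) {
    // 90-degree rotation about Z: q = (cos 45, 0, 0, sin 45)
    double c = Math.cos(Math.PI / 4), s = Math.sin(Math.PI / 4);
    double[][] m = toMatrix(c, 0, 0, s);
    System.out.printf("angle = %.6f rad%n", angleFromMatrix(m)); // prints angle = 1.570796 rad
  }
}
```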


@@ -44,7 +44,7 @@ import terrasaur.utils.math.MathConversions;
public class SPKFromSumFile implements TerrasaurTool {
private final static Logger logger = LogManager.getLogger();
private static final Logger logger = LogManager.getLogger();
@Override
public String shortDescription() {
@@ -54,7 +54,8 @@ public class SPKFromSumFile implements TerrasaurTool {
@Override
public String fullDescription(Options options) {
String header = "";
String footer = """
String footer =
"""
Given three or more sumfiles, fit a parabola to the spacecraft
trajectory in J2000 and create an input file for MKSPK.
""";
@@ -72,8 +73,9 @@ public class SPKFromSumFile implements TerrasaurTool {
private SPKFromSumFile() {}
private SPKFromSumFile(Body observer, Body target, ReferenceFrame bodyFixed, Map<String, Double> weightMap,
double extend) throws SpiceException {
private SPKFromSumFile(
Body observer, Body target, ReferenceFrame bodyFixed, Map<String, Double> weightMap, double extend)
throws SpiceException {
this.observer = observer;
this.target = target;
this.bodyFixed = bodyFixed;
@@ -102,8 +104,9 @@ public class SPKFromSumFile implements TerrasaurTool {
* @param velocityIsJ2000 if true, user-supplied velocity is in J2000 frame
* @return command to run MKSPK
*/
public String writeMKSPKFiles(String basename, List<String> comments, int degree, final Vector3 velocity,
boolean velocityIsJ2000) throws SpiceException {
public String writeMKSPKFiles(
String basename, List<String> comments, int degree, final Vector3 velocity, boolean velocityIsJ2000)
throws SpiceException {
String commentFile = basename + "-comments.txt";
String setupFile = basename + ".setup";
@@ -112,23 +115,27 @@ public class SPKFromSumFile implements TerrasaurTool {
try (PrintWriter pw = new PrintWriter(commentFile)) {
StringBuilder sb = new StringBuilder();
if (!comments.isEmpty()) {
for (String comment : comments)
sb.append(comment).append("\n");
for (String comment : comments) sb.append(comment).append("\n");
sb.append("\n");
}
sb.append(String.format("This SPK for %s was generated by fitting a parabola to each component of the " + "SCOBJ vector from " + "the following sumfiles:\n", target));
sb.append(String.format(
"This SPK for %s was generated by fitting a parabola to each component of the "
+ "SCOBJ vector from " + "the following sumfiles:\n",
target));
for (String sumFile : sumFilenames.values()) {
sb.append(String.format("\t%s\n", sumFile));
}
sb.append("The SCOBJ vector was transformed to J2000 and an aberration correction ");
sb.append(String.format("was applied to find the geometric position relative to %s before the parabola " + "fit. ", target.getName()));
sb.append(String.format("The period covered by this SPK is %s to %s.",
sb.append(String.format(
"was applied to find the geometric position relative to %s before the parabola " + "fit. ",
target.getName()));
sb.append(String.format(
"The period covered by this SPK is %s to %s.",
new TDBTime(interval.getBegin()).toUTCString("ISOC", 3),
new TDBTime(interval.getEnd()).toUTCString("ISOC", 3)));
String allComments = sb.toString();
for (String comment : allComments.split("\\r?\\n"))
pw.println(WordUtils.wrap(comment, 80));
for (String comment : allComments.split("\\r?\\n")) pw.println(WordUtils.wrap(comment, 80));
} catch (FileNotFoundException e) {
logger.error(e.getLocalizedMessage(), e);
}
@@ -196,23 +203,37 @@ public class SPKFromSumFile implements TerrasaurTool {
PolynomialFunction zPos = new PolynomialFunction(zCoeff);
PolynomialFunction zVel = zPos.polynomialDerivative();
logger.info("Polynomial fitting coefficients for geometric position of {} relative to {} in J2000:",
observer.getName(), target.getName());
logger.info(
"Polynomial fitting coefficients for geometric position of {} relative to {} in J2000:",
observer.getName(),
target.getName());
StringBuilder xMsg = new StringBuilder(String.format("X = %e ", xCoeff[0]));
StringBuilder yMsg = new StringBuilder(String.format("Y = %e ", yCoeff[0]));
StringBuilder zMsg = new StringBuilder(String.format("Z = %e ", zCoeff[0]));
for (int i = 1; i <= degree; i++) {
xMsg.append(xCoeff[i] < 0 ? "- " : "+ ").append(String.format("%e ", Math.abs(xCoeff[i]))).append("t").append(i > 1 ? "^" + i : "").append(" ");
yMsg.append(yCoeff[i] < 0 ? "- " : "+ ").append(String.format("%e ", Math.abs(yCoeff[i]))).append("t").append(i > 1 ? "^" + i : "").append(" ");
zMsg.append(zCoeff[i] < 0 ? "- " : "+ ").append(String.format("%e ", Math.abs(zCoeff[i]))).append("t").append(i > 1 ? "^" + i : "").append(" ");
xMsg.append(xCoeff[i] < 0 ? "- " : "+ ")
.append(String.format("%e ", Math.abs(xCoeff[i])))
.append("t")
.append(i > 1 ? "^" + i : "")
.append(" ");
yMsg.append(yCoeff[i] < 0 ? "- " : "+ ")
.append(String.format("%e ", Math.abs(yCoeff[i])))
.append("t")
.append(i > 1 ? "^" + i : "")
.append(" ");
zMsg.append(zCoeff[i] < 0 ? "- " : "+ ")
.append(String.format("%e ", Math.abs(zCoeff[i])))
.append("t")
.append(i > 1 ? "^" + i : "")
.append(" ");
}
logger.info(xMsg);
logger.info(yMsg);
logger.info(zMsg);
logger.debug("");
logger.debug("NOTE: comparing aberration correction=LT+S positions from sumfile with aberration " +
"correction=NONE for fit.");
logger.debug("NOTE: comparing aberration correction=LT+S positions from sumfile with aberration "
+ "correction=NONE for fit.");
for (Double t : sumFiles.keySet()) {
TDBTime tdb = new TDBTime(t);
SumFile sumFile = sumFiles.get(t);
@@ -234,32 +255,34 @@ public class SPKFromSumFile implements TerrasaurTool {
double vx = -xVel.value(t);
double vy = -yVel.value(t);
double vz = -zVel.value(t);
pw.printf("%.16e %.16e %.16e %.16e %.16e %.16e %.16e\n", t, -xPos.value(t), -yPos.value(t),
-zPos.value(t), vx, vy, vz);
pw.printf(
"%.16e %.16e %.16e %.16e %.16e %.16e %.16e\n",
t, -xPos.value(t), -yPos.value(t), -zPos.value(t), vx, vy, vz);
} else {
Vector3 thisVelocity = new Vector3(velocity);
if (!velocityIsJ2000) {
TDBTime tdb = new TDBTime(t);
thisVelocity = bodyFixed.getPositionTransformation(J2000, tdb).mxv(velocity);
thisVelocity =
bodyFixed.getPositionTransformation(J2000, tdb).mxv(velocity);
}
double vx = thisVelocity.getElt(0);
double vy = thisVelocity.getElt(1);
double vz = thisVelocity.getElt(2);
pw.printf("%.16e %.16e %.16e %.16e %.16e %.16e %.16e\n", t, -xPos.value(t), -yPos.value(t),
-zPos.value(t), vx, vy, vz);
pw.printf(
"%.16e %.16e %.16e %.16e %.16e %.16e %.16e\n",
t, -xPos.value(t), -yPos.value(t), -zPos.value(t), vx, vy, vz);
}
}
} catch (FileNotFoundException e) {
logger.error(e.getLocalizedMessage(), e);
}
try (PrintWriter pw = new PrintWriter(basename + ".csv")) {
pw.println("# Note: fit quantities are without light time or aberration corrections");
pw.println("# SCOBJ");
pw.println("# UTC, TDB, SumFile, SPICE (body fixed) x, y, z, SCOBJ (body fixed) x, y, z, SCOBJ (J2000) x," +
" y, z, SCOBJ (Geometric J2000) x, y, z, Fit SCOBJ (body fixed) x, y, z, Fit SCOBJ (Geometric " +
"J2000) x, y, z");
pw.println("# UTC, TDB, SumFile, SPICE (body fixed) x, y, z, SCOBJ (body fixed) x, y, z, SCOBJ (J2000) x,"
+ " y, z, SCOBJ (Geometric J2000) x, y, z, Fit SCOBJ (body fixed) x, y, z, Fit SCOBJ (Geometric "
+ "J2000) x, y, z");
for (Double t : sumFiles.keySet()) {
SumFile sumFile = sumFiles.get(t);
pw.printf("%s,", sumFile.utcString());
@@ -300,8 +323,8 @@ public class SPKFromSumFile implements TerrasaurTool {
pw.println();
}
pw.println("\n# Velocity");
pw.println("# UTC, TDB, SumFile, SPICE (body fixed) x, y, z, SPICE (J2000) x, y, z, Fit (body fixed) x, " +
"y, z, Fit (J2000) x, y, z");
pw.println("# UTC, TDB, SumFile, SPICE (body fixed) x, y, z, SPICE (J2000) x, y, z, Fit (body fixed) x, "
+ "y, z, Fit (J2000) x, y, z");
for (Double t : sumFiles.keySet()) {
SumFile sumFile = sumFiles.get(t);
pw.printf("%s,", sumFile.utcString());
@@ -327,7 +350,8 @@ public class SPKFromSumFile implements TerrasaurTool {
if (velocity != null) {
Vector3 thisVelocity = new Vector3(velocity);
if (!velocityIsJ2000) {
thisVelocity = bodyFixed.getPositionTransformation(J2000, tdb).mxv(velocity);
thisVelocity =
bodyFixed.getPositionTransformation(J2000, tdb).mxv(velocity);
}
double vx = thisVelocity.getElt(0);
double vy = thisVelocity.getElt(1);
@@ -335,8 +359,8 @@ public class SPKFromSumFile implements TerrasaurTool {
velJ2000 = new Vector3(vx, vy, vz);
}
StateVector stateJ2000 = new StateVector(new Vector3(xPos.value(t), yPos.value(t), zPos.value(t)),
velJ2000);
StateVector stateJ2000 =
new StateVector(new Vector3(xPos.value(t), yPos.value(t), zPos.value(t)), velJ2000);
velBodyFixed = j2000ToBodyFixed.mxv(stateJ2000).getVector3(1);
pw.printf("%s, ", velBodyFixed.getElt(0));
@@ -357,19 +381,39 @@ public class SPKFromSumFile implements TerrasaurTool {
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("degree").hasArg().desc("Degree of polynomial used to fit sumFile locations" + "." + " Default is 2.").build());
options.addOption(Option.builder("extend").hasArg().desc("Extend SPK past the last sumFile by <arg> seconds. "
+ " Default is zero.").build());
options.addOption(Option.builder("frame").hasArg().desc("Name of body fixed frame. This will default to the "
+ "target's body fixed frame.").build());
options.addOption(Option.builder("logFile").hasArg().desc("If present, save screen output to log file.").build());
options.addOption(Option.builder("degree")
.hasArg()
.desc("Degree of polynomial used to fit sumFile locations" + "." + " Default is 2.")
.build());
options.addOption(Option.builder("extend")
.hasArg()
.desc("Extend SPK past the last sumFile by <arg> seconds. " + " Default is zero.")
.build());
options.addOption(Option.builder("frame")
.hasArg()
.desc("Name of body fixed frame. This will default to the " + "target's body fixed frame.")
.build());
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values())
sb.append(String.format("%s ", l.name()));
options.addOption(Option.builder("logLevel").hasArg().desc("If present, print messages above selected " +
"priority. Valid values are " + sb.toString().trim() + ". Default is INFO.").build());
options.addOption(Option.builder("observer").required().hasArg().desc("Required. SPICE ID for the observer.").build());
options.addOption(Option.builder("sumFile").hasArg().required().desc("""
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(Option.builder("logLevel")
.hasArg()
.desc("If present, print messages above selected " + "priority. Valid values are "
+ sb.toString().trim() + ". Default is INFO.")
.build());
options.addOption(Option.builder("observer")
.required()
.hasArg()
.desc("Required. SPICE ID for the observer.")
.build());
options.addOption(Option.builder("sumFile")
.hasArg()
.required()
.desc(
"""
File listing sumfiles to read. This is a text file,
one per line. You can include an optional weight
after each filename. The default weight is 1.0.
@@ -385,14 +429,31 @@ public class SPKFromSumFile implements TerrasaurTool {
D717506131G0.SUM
# Weight this last image less than the others
D717506132G0.SUM 0.25
""").build());
options.addOption(Option.builder("spice").required().hasArgs().desc("Required. SPICE metakernel file " +
"containing body fixed frame and spacecraft kernels. Can specify more than one kernel, separated by "
+ "whitespace.").build());
options.addOption(Option.builder("target").required().hasArg().desc("Required. SPICE ID for the target.").build());
options.addOption(Option.builder("velocity").hasArgs().desc("Spacecraft velocity relative to target in the " + "body fixed frame. If present, use this fixed velocity in the MKSPK input file. Default is to " + "take the derivative of the fit position. Specify as three floating point values in km/sec, " + "separated by whitespace.").build());
options.addOption(Option.builder("velocityJ2000").desc("If present, argument to -velocity is in J2000 frame. "
+ " Ignored if -velocity is not set.").build()); return options;
""")
.build());
options.addOption(Option.builder("spice")
.required()
.hasArgs()
.desc("Required. SPICE metakernel file "
+ "containing body fixed frame and spacecraft kernels. Can specify more than one kernel, separated by "
+ "whitespace.")
.build());
options.addOption(Option.builder("target")
.required()
.hasArg()
.desc("Required. SPICE ID for the target.")
.build());
options.addOption(Option.builder("velocity")
.hasArgs()
.desc("Spacecraft velocity relative to target in the "
+ "body fixed frame. If present, use this fixed velocity in the MKSPK input file. Default is to "
+ "take the derivative of the fit position. Specify as three floating point values in km/sec, "
+ "separated by whitespace.")
.build());
options.addOption(Option.builder("velocityJ2000")
.desc("If present, argument to -velocity is in J2000 frame. " + " Ignored if -velocity is not set.")
.build());
return options;
}
public static void main(String[] args) throws SpiceException {
@@ -408,11 +469,9 @@ public class SPKFromSumFile implements TerrasaurTool {
NativeLibraryLoader.loadSpiceLibraries();
final double extend = cl.hasOption("extend") ? Double.parseDouble(cl.getOptionValue("extend")) : 0;
for (String kernel : cl.getOptionValues("spice"))
KernelDatabase.load(kernel);
for (String kernel : cl.getOptionValues("spice")) KernelDatabase.load(kernel);
Body observer = new Body(cl.getOptionValue("observer"));
Body target = new Body(cl.getOptionValue("target"));
@@ -468,5 +527,4 @@ public class SPKFromSumFile implements TerrasaurTool {
logger.info("Finished.");
}
}
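SPKFromSumFile fits a polynomial (degree 2 by default) to each component of the SCOBJ vector and differentiates the fit analytically to get velocity. The idea in miniature, exact-fitting a parabola through three samples and evaluating its derivative, might look like this (a hypothetical helper class for illustration only; the tool's actual fitter is a weighted least-squares fit over all sumfiles using Apache Commons Math):

```java
public class ParabolaFit {

  // Fit y = a + b t + c t^2 exactly through three samples (t[i], y[i])
  // by solving the 3x3 Vandermonde system with Cramer's rule.
  static double[] fit(double[] t, double[] y) {
    double[][] A = new double[3][3];
    for (int i = 0; i < 3; i++) {
      A[i][0] = 1;
      A[i][1] = t[i];
      A[i][2] = t[i] * t[i];
    }
    double det = det3(A);
    double[] coeff = new double[3];
    for (int j = 0; j < 3; j++) {
      // Replace column j with the y values, then take the determinant ratio.
      double[][] Aj = new double[3][3];
      for (int i = 0; i < 3; i++)
        for (int k = 0; k < 3; k++) Aj[i][k] = (k == j) ? y[i] : A[i][k];
      coeff[j] = det3(Aj) / det;
    }
    return coeff; // {a, b, c}
  }

  static double det3(double[][] m) {
    return m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
        - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
        + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]);
  }

  // Velocity is the analytic derivative of the fit: y' = b + 2 c t.
  static double velocity(double[] coeff, double t) {
    return coeff[1] + 2 * coeff[2] * t;
  }

  public static void main(String[] args) {
    double[] t = {0, 1, 2};
    double[] y = {1, 4, 11}; // samples of y = 1 + t + 2 t^2
    double[] c = fit(t, y);
    System.out.printf("a=%.1f b=%.1f c=%.1f v(1)=%.1f%n", c[0], c[1], c[2], velocity(c, 1));
    // prints a=1.0 b=1.0 c=2.0 v(1)=5.0
  }
}
```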


@@ -97,23 +97,19 @@ public class ShapeFormatConverter implements TerrasaurTool {
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("centerOfRotation")
options.addOption(Option.builder("centerOfRotation")
.hasArg()
.desc(
"Subtract this point before applying rotation matrix, add back after. "
+ "Specify by three floating point numbers separated by commas. If not present default is (0,0,0).")
.build());
options.addOption(
Option.builder("decimate")
options.addOption(Option.builder("decimate")
.hasArg()
.desc(
"Reduce the number of facets in a shape model. The argument should be between 0 and 1. "
.desc("Reduce the number of facets in a shape model. The argument should be between 0 and 1. "
+ "For example, if a model has 100 facets and the argument to -decimate is 0.90, "
+ "there will be approximately 10 facets after the decimation.")
.build());
options.addOption(
Option.builder("input")
options.addOption(Option.builder("input")
.required()
.hasArg()
.desc(
@@ -122,68 +118,51 @@ public class ShapeFormatConverter implements TerrasaurTool {
+ "by commas to specify XYZ coordinates, or latitude, longitude in degrees separated by commas. "
+ "Transformed point will be written to stdout in the same format as the input string.")
.build());
options.addOption(
Option.builder("inputFormat")
options.addOption(Option.builder("inputFormat")
.hasArg()
.desc(
"Format of input file. If not present format will be inferred from inputFile extension.")
.desc("Format of input file. If not present format will be inferred from inputFile extension.")
.build());
options.addOption(
Option.builder("output")
options.addOption(Option.builder("output")
.hasArg()
.desc(
"Required for all but single point input. Name of transformed file. "
.desc("Required for all but single point input. Name of transformed file. "
+ "Extension must be obj, plt, sbmt, stl, sum, or vtk.")
.build());
options.addOption(
Option.builder("outputFormat")
options.addOption(Option.builder("outputFormat")
.hasArg()
.desc(
"Format of output file. If not present format will be inferred from outputFile extension.")
.desc("Format of output file. If not present format will be inferred from outputFile extension.")
.build());
options.addOption(
Option.builder("register")
options.addOption(Option.builder("register")
.hasArg()
.desc("Use SVD to transform input file to best align with register file.")
.build());
options.addOption(
Option.builder("rotate")
options.addOption(Option.builder("rotate")
.hasArg()
.desc(
"Rotate surface points and spacecraft position. "
.desc("Rotate surface points and spacecraft position. "
+ "Specify by an angle (degrees) and a 3 element rotation axis vector (XYZ) "
+ "separated by commas.")
.build());
options.addOption(
Option.builder("rotateToPrincipalAxes")
options.addOption(Option.builder("rotateToPrincipalAxes")
.desc("Rotate body to align along its principal axes of inertia.")
.build());
options.addOption(
Option.builder("scale")
options.addOption(Option.builder("scale")
.hasArg()
.desc(
"Scale the shape model by <arg>. This can either be one value or three "
.desc("Scale the shape model by <arg>. This can either be one value or three "
+ "separated by commas. One value scales all three axes uniformly, "
+ "three values scale the x, y, and z axes respectively. For example, "
+ "-scale 0.5,0.25,1.5 scales the model in the x dimension by 0.5, the "
+ "y dimension by 0.25, the z dimension by 1.5.")
.build());
options.addOption(
Option.builder("translate")
options.addOption(Option.builder("translate")
.hasArg()
.desc(
"Translate surface points and spacecraft position. "
.desc("Translate surface points and spacecraft position. "
+ "Specify by three floating point numbers separated by commas.")
.build());
options.addOption(
Option.builder("translateToCenter")
options.addOption(Option.builder("translateToCenter")
.desc("Translate body so that its center of mass is at the origin.")
.build());
options.addOption(
Option.builder("transform")
options.addOption(Option.builder("transform")
.hasArg()
.desc(
"Translate and rotate surface points and spacecraft position. "
.desc("Translate and rotate surface points and spacecraft position. "
+ "Specify a file containing a 4x4 combined translation/rotation matrix. The top left 3x3 matrix "
+ "is the rotation matrix. The top three entries in the right hand column are the translation "
+ "vector. The bottom row is always 0 0 0 1.")
@@ -215,8 +194,9 @@ public class ShapeFormatConverter implements TerrasaurTool {
String extension = null;
if (cl.hasOption("inputFormat")) {
try {
extension =
FORMATS.valueOf(cl.getOptionValue("inputFormat").toUpperCase()).name().toLowerCase();
extension = FORMATS.valueOf(cl.getOptionValue("inputFormat").toUpperCase())
.name()
.toLowerCase();
} catch (IllegalArgumentException e) {
logger.warn("Unsupported -inputFormat {}", cl.getOptionValue("inputFormat"));
}
@@ -248,8 +228,7 @@ public class ShapeFormatConverter implements TerrasaurTool {
polydata = new vtkPolyData();
polydata.SetPoints(points);
if (params.length == 2) {
double[] array =
new Vector3D(
double[] array = new Vector3D(
Math.toRadians(Double.parseDouble(params[0].trim())),
Math.toRadians(Double.parseDouble(params[1].trim())))
.toArray();
@@ -261,8 +240,7 @@ public class ShapeFormatConverter implements TerrasaurTool {
points.InsertNextPoint(array);
coordType = COORDTYPE.XYZ;
} else {
logger.error(
"Can't read input shape model {} with format {}", filename, extension.toUpperCase());
logger.error("Can't read input shape model {} with format {}", filename, extension.toUpperCase());
System.exit(0);
}
}
@@ -293,12 +271,10 @@ public class ShapeFormatConverter implements TerrasaurTool {
for (Option option : cl.getOptions()) {
if (option.getOpt().equals("centerOfRotation"))
centerOfRotation =
MathConversions.toVector3(
VectorUtils.stringToVector3D(cl.getOptionValue("centerOfRotation")));
MathConversions.toVector3(VectorUtils.stringToVector3D(cl.getOptionValue("centerOfRotation")));
if (option.getOpt().equals("rotate"))
rotation =
MathConversions.toMatrix33(RotationUtils.stringToRotation(cl.getOptionValue("rotate")));
rotation = MathConversions.toMatrix33(RotationUtils.stringToRotation(cl.getOptionValue("rotate")));
if (option.getOpt().equals("scale")) {
String scaleString = cl.getOptionValue("scale");
@@ -310,8 +286,7 @@ public class ShapeFormatConverter implements TerrasaurTool {
}
if (option.getOpt().equals("translate"))
translation =
MathConversions.toVector3(VectorUtils.stringToVector3D(cl.getOptionValue("translate")));
translation = MathConversions.toVector3(VectorUtils.stringToVector3D(cl.getOptionValue("translate")));
if (option.getOpt().equals("transform")) {
List<String> lines =
@@ -359,7 +334,8 @@ public class ShapeFormatConverter implements TerrasaurTool {
vtkPoints points = polydata.GetPoints();
double[][] pointsA = new double[(int) points.GetNumberOfPoints()][3];
for (int i = 0; i < points.GetNumberOfPoints(); i++)
pointsA[i] = new Vector3D(points.GetPoint(i)).subtract(centerA).toArray();
pointsA[i] =
new Vector3D(points.GetPoint(i)).subtract(centerA).toArray();
points = registeredPolydata.GetPoints();
if (points.GetNumberOfPoints() != polydata.GetPoints().GetNumberOfPoints()) {
@@ -369,7 +345,8 @@ public class ShapeFormatConverter implements TerrasaurTool {
double[][] pointsB = new double[(int) points.GetNumberOfPoints()][3];
for (int i = 0; i < points.GetNumberOfPoints(); i++)
pointsB[i] = new Vector3D(points.GetPoint(i)).subtract(centerB).toArray();
pointsB[i] =
new Vector3D(points.GetPoint(i)).subtract(centerB).toArray();
double[][] H = new double[3][3];
for (int ii = 0; ii < points.GetNumberOfPoints(); ii++) {
@@ -400,8 +377,7 @@ public class ShapeFormatConverter implements TerrasaurTool {
if (sumFile != null) {
if (rotation != null && translation != null)
sumFile.transform(
MathConversions.toVector3D(translation), MathConversions.toRotation(rotation));
sumFile.transform(MathConversions.toVector3D(translation), MathConversions.toRotation(rotation));
} else {
Vector3 center;
@@ -472,13 +448,16 @@ public class ShapeFormatConverter implements TerrasaurTool {
extension = null;
if (cl.hasOption("outputFormat")) {
try {
extension =
FORMATS.valueOf(cl.getOptionValue("outputFormat").toUpperCase()).name().toLowerCase();
extension = FORMATS.valueOf(
cl.getOptionValue("outputFormat").toUpperCase())
.name()
.toLowerCase();
} catch (IllegalArgumentException e) {
logger.warn("Unsupported -outputFormat {}", cl.getOptionValue("outputFormat"));
}
}
if (extension == null) extension = FilenameUtils.getExtension(filename).toLowerCase();
if (extension == null)
extension = FilenameUtils.getExtension(filename).toLowerCase();
switch (extension) {
case "vtk" -> PolyDataUtil.saveShapeModelAsVTK(polydata, filename);
@@ -517,10 +496,7 @@ public class ShapeFormatConverter implements TerrasaurTool {
}
}
default -> {
logger.error(
"Can't write output shape model {} with format {}",
filename,
extension.toUpperCase());
logger.error("Can't write output shape model {} with format {}", filename, extension.toUpperCase());
System.exit(0);
}
}
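ShapeFormatConverter's -scale option accepts either one value (uniform scaling) or three comma-separated values (per-axis scaling, as in `-scale 0.5,0.25,1.5`). That parsing rule can be sketched as follows (a hypothetical helper written for illustration; the tool's actual parsing code is not shown in this diff):

```java
public class ScaleArg {

  // Parse a -scale argument: one value scales all three axes uniformly,
  // three comma-separated values scale the x, y, and z axes respectively.
  static double[] parseScale(String arg) {
    String[] parts = arg.trim().split(",");
    if (parts.length == 1) {
      double s = Double.parseDouble(parts[0].trim());
      return new double[] {s, s, s};
    }
    if (parts.length == 3) {
      double[] s = new double[3];
      for (int i = 0; i < 3; i++) s[i] = Double.parseDouble(parts[i].trim());
      return s;
    }
    throw new IllegalArgumentException("expected 1 or 3 comma-separated values: " + arg);
  }

  public static void main(String[] args) {
    double[] s = parseScale("0.5,0.25,1.5");
    System.out.printf("%.2f %.2f %.2f%n", s[0], s[1], s[2]); // prints 0.50 0.25 1.50
  }
}
```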


@@ -95,10 +95,8 @@ public class SumFilesFromFlyby implements TerrasaurTool {
this.sumfile = sumfile;
// given phase angle p, closest approach point is (cos p, sin p)
Vector3D closestApproach =
new Vector3D(FastMath.cos(phase), FastMath.sin(phase), 0.).scalarMultiply(distance);
Vector3D velocity =
new Vector3D(-FastMath.sin(phase), FastMath.cos(phase), 0.).scalarMultiply(speed);
Vector3D closestApproach = new Vector3D(FastMath.cos(phase), FastMath.sin(phase), 0.).scalarMultiply(distance);
Vector3D velocity = new Vector3D(-FastMath.sin(phase), FastMath.cos(phase), 0.).scalarMultiply(speed);
/*-
* Assumptions:
@@ -143,8 +141,7 @@ public class SumFilesFromFlyby implements TerrasaurTool {
return s;
}
private String writeMSOPCKFiles(
String basename, IntervalSet intervals, int frameID, SpiceBundle bundle)
private String writeMSOPCKFiles(String basename, IntervalSet intervals, int frameID, SpiceBundle bundle)
throws SpiceException {
File commentFile = new File(basename + "_msopck-comments.txt");
@@ -195,10 +192,8 @@ public class SumFilesFromFlyby implements TerrasaurTool {
double imageTime = t + t0;
Vector3D scPos = scPosFunc.apply(t);
SpiceQuaternion q =
new SpiceQuaternion(
MathConversions.toMatrix33(
RotationUtils.KprimaryJsecondary(scPos.negate(), Vector3D.MINUS_K)));
SpiceQuaternion q = new SpiceQuaternion(
MathConversions.toMatrix33(RotationUtils.KprimaryJsecondary(scPos.negate(), Vector3D.MINUS_K)));
attitudeMap.put(imageTime, q);
}
@@ -210,11 +205,7 @@ public class SumFilesFromFlyby implements TerrasaurTool {
Vector3 v = q.getVector();
pw.printf(
"%s %.14e %.14e %.14e %.14e\n",
new TDBTime(t).toUTCString("ISOC", 6),
q.getScalar(),
v.getElt(0),
v.getElt(1),
v.getElt(2));
new TDBTime(t).toUTCString("ISOC", 6), q.getScalar(), v.getElt(0), v.getElt(1), v.getElt(2));
}
} catch (IOException e) {
logger.error(e.getLocalizedMessage(), e);
@@ -230,8 +221,7 @@ public class SumFilesFromFlyby implements TerrasaurTool {
* @param bundle SPICE bundle
* @return command to run MKSPK
*/
private String writeMKSPKFiles(
String basename, IntervalSet intervals, int centerID, SpiceBundle bundle) {
private String writeMKSPKFiles(String basename, IntervalSet intervals, int centerID, SpiceBundle bundle) {
String commentFile = basename + "_mkspk-comments.txt";
String setupFile = basename + "_mkspk.setup";
@@ -275,94 +265,72 @@ public class SumFilesFromFlyby implements TerrasaurTool {
try (PrintWriter pw = new PrintWriter(inputFile)) {
for (UnwritableInterval interval : intervals) {
for (double t = interval.getBegin();
t < interval.getEnd();
t += interval.getLength() / 100) {
for (double t = interval.getBegin(); t < interval.getEnd(); t += interval.getLength() / 100) {
Vector3D scPos = scPosFunc.apply(t);
Vector3D scVel = scVelFunc.apply(t);
pw.printf(
"%.16e %.16e %.16e %.16e %.16e %.16e %.16e\n",
t + t0,
scPos.getX(),
scPos.getY(),
scPos.getZ(),
scVel.getX(),
scVel.getY(),
scVel.getZ());
t + t0, scPos.getX(), scPos.getY(), scPos.getZ(), scVel.getX(), scVel.getY(), scVel.getZ());
}
}
} catch (FileNotFoundException e) {
logger.error(e.getLocalizedMessage(), e);
}
return String.format(
"mkspk -setup %s -input %s -output %s.bsp", setupFile, inputFile, basename);
return String.format("mkspk -setup %s -input %s -output %s.bsp", setupFile, inputFile, basename);
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("distance")
options.addOption(Option.builder("distance")
.hasArg()
.required()
.desc("Required. Flyby distance from body center in km.")
.build());
options.addOption(
Option.builder("logFile")
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(
Option.builder("logLevel")
options.addOption(Option.builder("logLevel")
.hasArg()
.desc(
"If present, print messages above selected priority. Valid values are "
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim()
+ ". Default is INFO.")
.build());
options.addOption(
Option.builder("mk")
options.addOption(Option.builder("mk")
.hasArg()
.desc(
"Path to NAIF metakernel. This should contain LSK, FK for the central body, and FK for the spacecraft. This is used by -mkspk and -msopck.")
.build());
options.addOption(
Option.builder("mkspk")
options.addOption(Option.builder("mkspk")
.hasArg()
.desc(
"If present, create input files for MKSPK. The argument is the NAIF id for the central body (e.g. 10 for the Sun). This option requires -lsk.")
.build());
options.addOption(
Option.builder("msopck")
options.addOption(Option.builder("msopck")
.desc("If present, create input files for MSOPCK. This option requires -lsk.")
.build());
options.addOption(
Option.builder("phase")
options.addOption(Option.builder("phase")
.hasArg()
.required()
.desc("Required. Phase angle at closest approach in degrees.")
.build());
options.addOption(
Option.builder("speed")
options.addOption(Option.builder("speed")
.hasArg()
.required()
.desc("Required. Flyby speed at closest approach in km/s.")
.build());
options.addOption(
Option.builder("template")
options.addOption(Option.builder("template")
.hasArg()
.required()
.desc(
"Required. An existing sumfile to use as a template. Camera parameters are taken from this "
.desc("Required. An existing sumfile to use as a template. Camera parameters are taken from this "
+ "file, while camera position and orientation are calculated.")
.build());
options.addOption(
Option.builder("times")
options.addOption(Option.builder("times")
.hasArgs()
.desc(
"If present, list of times separated by white space. In seconds, relative to closest approach.")
.desc("If present, list of times separated by white space. In seconds, relative to closest approach.")
.build());
return options;
}
@@ -388,8 +356,7 @@ public class SumFilesFromFlyby implements TerrasaurTool {
String base = FilenameUtils.getBaseName(sumFileTemplate);
String ext = FilenameUtils.getExtension(sumFileTemplate);
SumFilesFromFlyby app =
new SumFilesFromFlyby(
SumFilesFromFlyby app = new SumFilesFromFlyby(
SumFile.fromFile(new File(sumFileTemplate)),
Double.parseDouble(cl.getOptionValue("distance")),
Math.toRadians(phase),
@@ -403,22 +370,19 @@ public class SumFilesFromFlyby implements TerrasaurTool {
SpiceBundle bundle = null;
if (cl.hasOption("mk")) {
NativeLibraryLoader.loadSpiceLibraries();
bundle =
new SpiceBundle.Builder()
bundle = new SpiceBundle.Builder()
.addMetakernels(Collections.singletonList(cl.getOptionValue("mk")))
.build();
KernelDatabase.load(cl.getOptionValue("mk"));
}
TimeConversion tc =
bundle == null ? TimeConversion.createUsingInternalConstants() : bundle.getTimeConversion();
TimeConversion tc = bundle == null ? TimeConversion.createUsingInternalConstants() : bundle.getTimeConversion();
for (double t : times) {
SumFile s = app.getSumFile(t);
try (PrintWriter pw =
new PrintWriter(
String.format("%s_%d.%s", base, (int) tc.utcStringToTDB(s.utcString()), ext))) {
new PrintWriter(String.format("%s_%d.%s", base, (int) tc.utcStringToTDB(s.utcString()), ext))) {
pw.println(s);
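The closest-approach construction above places the spacecraft at (cos p, sin p, 0) scaled by the flyby distance, with velocity along (-sin p, cos p, 0) scaled by the speed, so position and velocity are perpendicular at closest approach by construction. A minimal standalone sketch of that geometry (hypothetical helper class, not part of SumFilesFromFlyby; the 35 degree / 1000 km / 6 km/s values are arbitrary illustration inputs):

```java
// Sketch of the closest-approach state used by SumFilesFromFlyby:
// phase angle p (radians), distance d (km), speed v (km/s), in the z = 0 plane.
class FlybyGeometry {

    static double[] position(double p, double d) {
        return new double[] {d * Math.cos(p), d * Math.sin(p), 0.0};
    }

    static double[] velocity(double p, double v) {
        return new double[] {-v * Math.sin(p), v * Math.cos(p), 0.0};
    }

    static double dot(double[] a, double[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    public static void main(String[] args) {
        double p = Math.toRadians(35.0), d = 1000.0, v = 6.0;
        // Perpendicular at closest approach: dot product is (numerically) zero.
        System.out.printf("position . velocity = %.3e%n",
                dot(position(p, d), velocity(p, v)));
    }
}
```

Because the radial and tangential terms cancel analytically, the dot product is zero to floating-point precision for any phase angle.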

View File

@@ -101,37 +101,31 @@ public class TileLookup implements TerrasaurTool {
if (fullPath.trim().isEmpty()) fullPath = ".";
if (!fullPath.endsWith(File.separator)) fullPath += File.separator;
return String.format(
"%s%s.%d.%s",
fullPath, FilenameUtils.getBaseName(dbName), tile, FilenameUtils.getExtension(dbName));
"%s%s.%d.%s", fullPath, FilenameUtils.getBaseName(dbName), tile, FilenameUtils.getExtension(dbName));
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("nTiles")
options.addOption(Option.builder("nTiles")
.hasArg()
.required()
.desc("Number of points covering the sphere.")
.build());
options.addOption(
Option.builder("printCoords")
options.addOption(Option.builder("printCoords")
.desc(
"Print a table of points along with their coordinates. Takes precedence over -printStats, -printDistance, and -png.")
.build());
options.addOption(
Option.builder("printDistance")
options.addOption(Option.builder("printDistance")
.hasArg()
.desc(
"Print a table of points sorted by distance from the input point. "
+ "Format of the input point is longitude,latitude in degrees, comma separated without spaces. Takes precedence over -png.")
.build());
options.addOption(
Option.builder("printStats")
options.addOption(Option.builder("printStats")
.desc(
"Print statistics on the distances (in degrees) between each point and its nearest neighbor. Takes precedence over -printDistance and -png.")
.build());
options.addOption(
Option.builder("png")
options.addOption(Option.builder("png")
.hasArg()
.desc(
"Plot points and distance to nearest point in degrees. Argument is the name of the PNG file to write.")
@@ -168,8 +162,7 @@ public class TileLookup implements TerrasaurTool {
}
if (cl.hasOption("printStats")) {
System.out.println(
"Statistics on distances between each point and its nearest neighbor (degrees):");
System.out.println("Statistics on distances between each point and its nearest neighbor (degrees):");
System.out.println(fs.getDistanceStats());
System.exit(0);
}
@@ -196,30 +189,28 @@ public class TileLookup implements TerrasaurTool {
}
if (cl.hasOption("png")) {
PlotConfig config = ImmutablePlotConfig.builder().width(1000).height(1000).build();
PlotConfig config =
ImmutablePlotConfig.builder().width(1000).height(1000).build();
String title = String.format("Fibonacci Sphere, n = %d, ", npts);
Map<String, Projection> projections = new LinkedHashMap<>();
projections.put(
title + "Rectangular Projection",
new ProjectionRectangular(config.width(), config.height() / 2));
title + "Rectangular Projection", new ProjectionRectangular(config.width(), config.height() / 2));
projections.put(
title + "Orthographic Projection (0, 90)",
new ProjectionOrthographic(
config.width(), config.height(), new LatitudinalVector(1, Math.PI / 2, 0)));
projections.put(
title + "Orthographic Projection (0, 0)",
new ProjectionOrthographic(
config.width(), config.height(), new LatitudinalVector(1, 0, 0)));
new ProjectionOrthographic(config.width(), config.height(), new LatitudinalVector(1, 0, 0)));
projections.put(
title + "Orthographic Projection (90, 0)",
new ProjectionOrthographic(
config.width(), config.height(), new LatitudinalVector(1, 0, Math.PI / 2)));
projections.put(
title + "Orthographic Projection (180, 0)",
new ProjectionOrthographic(
config.width(), config.height(), new LatitudinalVector(1, 0, Math.PI)));
new ProjectionOrthographic(config.width(), config.height(), new LatitudinalVector(1, 0, Math.PI)));
projections.put(
title + "Orthographic Projection (270, 0)",
new ProjectionOrthographic(
@@ -235,7 +226,11 @@ public class TileLookup implements TerrasaurTool {
colors.add(Color.BLACK);
for (int i = 1; i < nColors; i++) colors.add(ramp.getColor(i));
colors.add(Color.WHITE);
ramp = ImmutableColorRamp.builder().min(0).max(nColors).colors(colors).build();
ramp = ImmutableColorRamp.builder()
.min(0)
.max(nColors)
.colors(colors)
.build();
double radius = fs.getDistanceStats().getMax();
ramp = ColorRamp.createLinear(0, radius).addLimitColors();
@@ -246,8 +241,15 @@ public class TileLookup implements TerrasaurTool {
Projection p = projections.get(t);
if (p instanceof ProjectionRectangular)
config = ImmutablePlotConfig.builder().from(config).height(500).build();
else config = ImmutablePlotConfig.builder().from(config).height(1000).build();
config = ImmutablePlotConfig.builder()
.from(config)
.height(500)
.build();
else
config = ImmutablePlotConfig.builder()
.from(config)
.height(1000)
.build();
MapPlot canvas = new MapPlot(config, p);
AxisX xLowerAxis = new AxisX(0, 360, "Longitude (degrees)", "%.0fE");
@@ -262,7 +264,8 @@ public class TileLookup implements TerrasaurTool {
for (int j = 0; j < config.height(); j++) {
LatitudinalVector lv = p.pixelToSpherical(i, j);
if (lv == null) continue;
double closestDistance = Math.toDegrees(fs.getNearest(lv).getKey());
double closestDistance =
Math.toDegrees(fs.getNearest(lv).getKey());
// int numPoints = fs.getDistanceMap(lv).subMap(0., Math.toRadians(radius)).size();
image.setRGB(
config.leftMargin() + i,
@@ -278,11 +281,8 @@ public class TileLookup implements TerrasaurTool {
}
if (p instanceof ProjectionRectangular) {
canvas.drawColorBar(
ImmutableColorBar.builder()
.rect(
new Rectangle(
canvas.getPageWidth() - 60, config.topMargin(), 10, config.height()))
canvas.drawColorBar(ImmutableColorBar.builder()
.rect(new Rectangle(canvas.getPageWidth() - 60, config.topMargin(), 10, config.height()))
.ramp(ramp)
.numTicks(nColors + 1)
.tickFunction(aDouble -> String.format("%.1f", aDouble))
@@ -297,7 +297,9 @@ public class TileLookup implements TerrasaurTool {
for (int i = 0; i < fs.getNumTiles(); i++) {
LatitudinalVector lv = fs.getTileCenter(i);
canvas.addAnnotation(
ImmutableAnnotation.builder().text(String.format("%d", i)).build(),
ImmutableAnnotation.builder()
.text(String.format("%d", i))
.build(),
lv.getLongitude(),
lv.getLatitude());
}
@@ -310,8 +312,7 @@ public class TileLookup implements TerrasaurTool {
BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_3BYTE_BGR);
Graphics2D g = image.createGraphics();
g.setRenderingHint(
RenderingHints.KEY_INTERPOLATION, RenderingHints.VALUE_INTERPOLATION_BILINEAR);
g.setRenderingHint(RenderingHints.KEY_INTERPOLATION, RenderingHints.VALUE_INTERPOLATION_BILINEAR);
int imageWidth = width;
int imageHeight = height / 3;
g.drawImage(
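TileLookup's point distribution itself is not shown in this hunk; a common way to place n near-uniform tile centers on the sphere, consistent with the "Fibonacci Sphere" plot titles above, is the golden-angle lattice. A sketch under that assumption (hypothetical helper, not the FibonacciSphere class used by the tool):

```java
// Standard golden-angle (Fibonacci) lattice: point i of n gets an evenly
// spaced latitude band (uniform in sin(lat)) and a longitude that advances
// by the golden angle each step.
class FibSphere {

    static double[] tileCenterLonLat(int i, int n) {
        double goldenAngle = Math.PI * (3.0 - Math.sqrt(5.0)); // ~2.39996 rad
        double z = 1.0 - (2.0 * i + 1.0) / n;                  // sin(lat), in (-1, 1)
        double lat = Math.asin(z);
        double lon = (i * goldenAngle) % (2.0 * Math.PI);
        return new double[] {lon, lat};
    }

    public static void main(String[] args) {
        int n = 100;
        for (int i = 0; i < n; i += 25) {
            double[] c = tileCenterLonLat(i, n);
            System.out.printf("tile %d: lon %.3f, lat %.3f rad%n", i, c[0], c[1]);
        }
    }
}
```

Uniformity in sin(lat) keeps the per-tile solid angle roughly constant, which is why the nearest-neighbor distance statistics printed by -printStats stay tight.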

View File

@@ -52,7 +52,6 @@ import terrasaur.utils.SPICEUtil;
public class TransformFrame implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
@Override
public String shortDescription() {
return "Transform coordinates between reference frames.";
@@ -93,38 +92,52 @@ public class TransformFrame implements TerrasaurTool {
try (PrintWriter pw = new PrintWriter(outFile)) {
for (TDBTime t : pointsOut.keySet()) {
Vector3 v = pointsOut.get(t);
pw.printf("%.6f,%.6e,%.6e,%.6e\n", t.getTDBSeconds(), v.getElt(0), v.getElt(1),
v.getElt(2));
pw.printf("%.6f,%.6e,%.6e,%.6e\n", t.getTDBSeconds(), v.getElt(0), v.getElt(1), v.getElt(2));
}
} catch (FileNotFoundException | SpiceException e) {
logger.error(e.getLocalizedMessage());
}
}
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("inFile").required().hasArg()
options.addOption(Option.builder("inFile")
.required()
.hasArg()
.desc("Required. Text file containing comma separated t, x, y, z values. Time is ET.")
.build());
options.addOption(Option.builder("inFrame").required().hasArg()
.desc("Required. Name of inFile reference frame.").build());
options.addOption(Option.builder("outFile").required().hasArg()
.desc("Required. Name of output file. It will be in the same format as inFile.").build());
options.addOption(Option.builder("outFrame").required().hasArg()
.desc("Required. Name of outFile reference frame.").build());
options.addOption(Option.builder("spice").required().hasArg().desc(
"Required. Name of SPICE metakernel containing kernels needed to make the frame transformation.")
options.addOption(Option.builder("inFrame")
.required()
.hasArg()
.desc("Required. Name of inFile reference frame.")
.build());
options.addOption(Option.builder("outFile")
.required()
.hasArg()
.desc("Required. Name of output file. It will be in the same format as inFile.")
.build());
options.addOption(Option.builder("outFrame")
.required()
.hasArg()
.desc("Required. Name of outFile reference frame.")
.build());
options.addOption(Option.builder("spice")
.required()
.hasArg()
.desc("Required. Name of SPICE metakernel containing kernels needed to make the frame transformation.")
.build());
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
options.addOption(Option.builder("logFile").hasArg()
.desc("If present, save screen output to log file.").build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values())
sb.append(String.format("%s ", l.name()));
options.addOption(Option.builder("logLevel").hasArg()
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(Option.builder("logLevel")
.hasArg()
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim() + ". Default is INFO.")
.build()); return options;
.build());
return options;
}
public static void main(String[] args) {
@@ -145,14 +158,15 @@ public class TransformFrame implements TerrasaurTool {
List<String> lines = FileUtils.readLines(f, Charset.defaultCharset());
for (String line : lines) {
String trim = line.trim();
if (trim.isEmpty() || trim.startsWith("#"))
continue;
if (trim.isEmpty() || trim.startsWith("#")) continue;
String[] parts = trim.split(",");
double et = Double.parseDouble(parts[0].trim());
if (et > 0) {
TDBTime t = new TDBTime(et);
Vector3 v = new Vector3(Double.parseDouble(parts[1].trim()),
Double.parseDouble(parts[2].trim()), Double.parseDouble(parts[3].trim()));
Vector3 v = new Vector3(
Double.parseDouble(parts[1].trim()),
Double.parseDouble(parts[2].trim()),
Double.parseDouble(parts[3].trim()));
map.put(t, v);
}
}
@@ -172,5 +186,4 @@ public class TransformFrame implements TerrasaurTool {
tf.transformCoordinates(cl.getOptionValue("inFrame"), cl.getOptionValue("outFrame"));
tf.write(cl.getOptionValue("outFile"));
}
}

View File

@@ -61,11 +61,14 @@ public class TranslateTime implements TerrasaurTool {
String header = "";
String footer = "\nConvert between different time systems.\n";
return TerrasaurTool.super.fullDescription(options, header, footer);
}
private enum Types {
JULIAN, SCLK, TDB, TDBCALENDAR, UTC
JULIAN,
SCLK,
TDB,
TDBCALENDAR,
UTC
}
private Map<Integer, SCLK> sclkMap;
@@ -129,29 +132,39 @@ public class TranslateTime implements TerrasaurTool {
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("logFile").hasArg()
.desc("If present, save screen output to log file.").build());
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values())
sb.append(String.format("%s ", l.name()));
options.addOption(Option.builder("logLevel").hasArg()
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(Option.builder("logLevel")
.hasArg()
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim() + ". Default is INFO.")
.build());
options.addOption(Option.builder("sclk").hasArg().desc(
"SPICE id of the sclk to use. Default is to use the first one found in the kernel pool.")
options.addOption(Option.builder("sclk")
.hasArg()
.desc("SPICE id of the sclk to use. Default is to use the first one found in the kernel pool.")
.build());
options.addOption(Option.builder("spice")
.required()
.hasArg()
.desc("Required. SPICE metakernel containing leap second and SCLK.")
.build());
options.addOption(Option.builder("spice").required().hasArg()
.desc("Required. SPICE metakernel containing leap second and SCLK.").build());
options.addOption(Option.builder("gui").desc("Launch a GUI.").build());
options.addOption(Option.builder("inputDate").hasArgs().desc("Date to translate.").build());
options.addOption(
Option.builder("inputDate").hasArgs().desc("Date to translate.").build());
sb = new StringBuilder();
for (Types system : Types.values()) {
sb.append(String.format("%s ", system.name()));
}
options.addOption(Option.builder("inputSystem").hasArg().desc(
"Timesystem of inputDate. Valid values are " + sb.toString().trim() + ". Default is UTC.")
.build()); return options;
options.addOption(Option.builder("inputSystem")
.hasArg()
.desc("Timesystem of inputDate. Valid values are "
+ sb.toString().trim() + ". Default is UTC.")
.build());
return options;
}
public static void main(String[] args) throws SpiceException {
@@ -166,13 +179,11 @@ public class TranslateTime implements TerrasaurTool {
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
// This is to avoid java crashing due to inability to connect to an X display
if (!cl.hasOption("gui"))
System.setProperty("java.awt.headless", "true");
if (!cl.hasOption("gui")) System.setProperty("java.awt.headless", "true");
NativeLibraryLoader.loadSpiceLibraries();
for (String kernel : cl.getOptionValues("spice"))
KernelDatabase.load(kernel);
for (String kernel : cl.getOptionValues("spice")) KernelDatabase.load(kernel);
LinkedHashMap<Integer, SCLK> sclkMap = new LinkedHashMap<>();
String[] sclk_data_type = KernelPool.getNames("SCLK_DATA_*");
@@ -185,13 +196,11 @@ public class TranslateTime implements TerrasaurTool {
SCLK sclk = null;
if (cl.hasOption("sclk")) {
int sclkID = Integer.parseInt(cl.getOptionValue("sclk"));
if (sclkMap.containsKey(sclkID))
sclk = sclkMap.get(sclkID);
if (sclkMap.containsKey(sclkID)) sclk = sclkMap.get(sclkID);
else {
logger.error("Cannot find SCLK {} in kernel pool!", sclkID);
StringBuilder sb = new StringBuilder();
for (Integer id : sclkMap.keySet())
sb.append(String.format("%d ", id));
for (Integer id : sclkMap.keySet()) sb.append(String.format("%d ", id));
logger.error("Loaded IDs are {}", sb.toString());
}
} else {
@@ -221,12 +230,11 @@ public class TranslateTime implements TerrasaurTool {
}
StringBuilder sb = new StringBuilder();
for (String s : cl.getOptionValues("inputDate"))
sb.append(String.format("%s ", s));
for (String s : cl.getOptionValues("inputDate")) sb.append(String.format("%s ", s));
String inputDate = sb.toString().trim();
Types type =
cl.hasOption("inputSystem") ? Types.valueOf(cl.getOptionValue("inputSystem").toUpperCase())
Types type = cl.hasOption("inputSystem")
? Types.valueOf(cl.getOptionValue("inputSystem").toUpperCase())
: Types.UTC;
switch (type) {
@@ -248,15 +256,19 @@ public class TranslateTime implements TerrasaurTool {
}
System.out.printf("# input date %s (%s)\n", inputDate, type.name());
System.out.printf("# UTC, TDB (Calendar), DOY, TDB, Julian Date, SCLK (%d)\n",
sclk.getIDCode());
System.out.printf("# UTC, TDB (Calendar), DOY, TDB, Julian Date, SCLK (%d)\n", sclk.getIDCode());
String utcString = tt.toTDB().toUTCString("ISOC", 3);
String tdbString = tt.toTDB().toString("YYYY-MM-DDTHR:MN:SC.### ::TDB");
String doyString = tt.toTDB().toString("DOY");
System.out.printf("%s, %s, %s, %.6f, %s, %s\n", utcString, tdbString, doyString,
tt.toTDB().getTDBSeconds(), tt.toJulian(), tt.toSCLK().toString());
System.out.printf(
"%s, %s, %s, %.6f, %s, %s\n",
utcString,
tdbString,
doyString,
tt.toTDB().getTDBSeconds(),
tt.toJulian(),
tt.toSCLK().toString());
}
}

View File

@@ -0,0 +1,159 @@
/*
* The MIT License
* Copyright © 2025 Johns Hopkins University Applied Physics Laboratory
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in
* all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
* THE SOFTWARE.
*/
package terrasaur.apps;
import java.io.*;
import java.util.Map;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
import org.apache.commons.io.FilenameUtils;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import terrasaur.templates.TerrasaurTool;
import terrasaur.utils.ICQUtils;
import terrasaur.utils.NativeLibraryLoader;
import terrasaur.utils.PolyDataUtil;
import vtk.vtkPolyData;
public class TriAx implements TerrasaurTool {
private static final Logger logger = LogManager.getLogger();
private TriAx() {}
@Override
public String shortDescription() {
return "Generate a triaxial ellipsoid in ICQ format.";
}
@Override
public String fullDescription(Options options) {
String footer =
"\nTriAx is an implementation of the SPC tool TRIAX, which generates a triaxial ellipsoid in ICQ format.\n";
return TerrasaurTool.super.fullDescription(options, "", footer);
}
static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(Option.builder("A")
.required()
.hasArg()
.desc("Long axis of the ellipsoid, arbitrary units (usually assumed to be km).")
.build());
options.addOption(Option.builder("B")
.required()
.hasArg()
.desc("Medium axis of the ellipsoid, arbitrary units (usually assumed to be km).")
.build());
options.addOption(Option.builder("C")
.required()
.hasArg()
.desc("Short axis of the ellipsoid, arbitrary units (usually assumed to be km).")
.build());
options.addOption(Option.builder("Q")
.required()
.hasArg()
.desc("ICQ size parameter. This is conventionally but not necessarily a power of 2.")
.build());
options.addOption(Option.builder("saveOBJ")
.desc("If present, save in OBJ format as well. "
+ "File will have the same name as ICQ file with an OBJ extension.")
.build());
options.addOption(Option.builder("output")
.hasArg()
.required()
.desc("Name of ICQ file to write.")
.build());
return options;
}
static final int MAX_Q = 512;
public static void main(String[] args) {
TerrasaurTool defaultOBJ = new TriAx();
Options options = defineOptions();
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet()) logger.info("{} {}", ml.label, startupMessages.get(ml));
int q = Integer.parseInt(cl.getOptionValue("Q"));
String shapefile = cl.getOptionValue("output");
double[] ax = new double[3];
ax[0] = Double.parseDouble(cl.getOptionValue("A"));
ax[1] = Double.parseDouble(cl.getOptionValue("B"));
ax[2] = Double.parseDouble(cl.getOptionValue("C"));
double[][][][] vec = new double[3][MAX_Q + 1][MAX_Q + 1][6];
for (int f = 0; f < 6; f++) {
for (int i = 0; i <= q; i++) {
for (int j = 0; j <= q; j++) {
double[] u = ICQUtils.xyf2u(q, i, j, f, ax);
double z = 1
/ Math.sqrt(
Math.pow(u[0] / ax[0], 2) + Math.pow(u[1] / ax[1], 2) + Math.pow(u[2] / ax[2], 2));
double[] v = new Vector3D(u).scalarMultiply(z).toArray();
for (int k = 0; k < 3; k++) {
vec[k][i][j][f] = v[k];
}
}
}
}
ICQUtils.writeICQ(q, vec, shapefile);
if (cl.hasOption("saveOBJ")) {
String basename = FilenameUtils.getBaseName(shapefile);
String dirname = FilenameUtils.getFullPath(shapefile);
if (dirname.isEmpty()) dirname = ".";
File obj = new File(dirname, basename + ".obj");
NativeLibraryLoader.loadVtkLibraries();
try {
vtkPolyData polydata = PolyDataUtil.loadShapeModel(shapefile);
if (polydata == null) {
logger.error("Cannot read {}", shapefile);
System.exit(0);
}
polydata = PolyDataUtil.removeDuplicatePoints(polydata);
polydata = PolyDataUtil.removeUnreferencedPoints(polydata);
polydata = PolyDataUtil.removeZeroAreaFacets(polydata);
PolyDataUtil.saveShapeModelAsOBJ(polydata, obj.getPath());
} catch (Exception e) {
logger.error(e);
}
}
}
}
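The loop above maps each ICQ cube point u onto the ellipsoid by scaling it with z = 1/sqrt((u_x/A)^2 + (u_y/B)^2 + (u_z/C)^2), which places the scaled point exactly on the surface (x/A)^2 + (y/B)^2 + (z/C)^2 = 1. A minimal sketch of just that projection step (hypothetical helper; ICQUtils.xyf2u and the ICQ bookkeeping are not reproduced):

```java
// Cube-to-ellipsoid projection as used in TriAx: scale a body-frame point u
// so it lands on the triaxial ellipsoid with semi-axes ax = {A, B, C}.
class EllipsoidProject {

    static double[] project(double[] u, double[] ax) {
        double z = 1.0 / Math.sqrt(
                Math.pow(u[0] / ax[0], 2)
                        + Math.pow(u[1] / ax[1], 2)
                        + Math.pow(u[2] / ax[2], 2));
        return new double[] {u[0] * z, u[1] * z, u[2] * z};
    }

    // Left-hand side of the ellipsoid equation; equals 1 on the surface.
    static double ellipsoidValue(double[] p, double[] ax) {
        return Math.pow(p[0] / ax[0], 2)
                + Math.pow(p[1] / ax[1], 2)
                + Math.pow(p[2] / ax[2], 2);
    }

    public static void main(String[] args) {
        double[] ax = {3.0, 2.0, 1.0}; // A >= B >= C, arbitrary units
        double[] p = project(new double[] {1.0, 1.0, 1.0}, ax);
        System.out.printf("on-surface check: %.9f%n", ellipsoidValue(p, ax));
    }
}
```

The scale factor z depends only on the direction of u, so every cube-face sample projects radially onto the ellipsoid, matching the SPC TRIAX convention.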

View File

@@ -30,7 +30,6 @@ import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
@@ -58,30 +57,33 @@ public class ValidateNormals implements TerrasaurTool {
@Override
public String fullDescription(Options options) {
String footer =
"\nThis program checks that the normals of the shape model are not pointing inward.\n";
String footer = "\nThis program checks that the normals of the shape model are not pointing inward.\n";
return TerrasaurTool.super.fullDescription(options, "", footer);
}
static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("origin")
options.addOption(Option.builder("fast")
.desc("If present, only check for overhangs if center and normal point in opposite "
+ "directions. Default behavior is to always check for intersections between body center "
+ "and facet center.")
.build());
options.addOption(Option.builder("origin")
.hasArg()
.desc(
"If present, center of body in xyz coordinates. "
.desc("If present, center of body in xyz coordinates. "
+ "Specify as three floating point values separated by commas. Default is to use the centroid of "
+ "the input shape model.")
.build());
options.addOption(
Option.builder("obj").required().hasArg().desc("Shape model to validate.").build());
options.addOption(
Option.builder("output")
options.addOption(Option.builder("obj")
.required()
.hasArg()
.desc("Shape model to validate.")
.build());
options.addOption(Option.builder("output")
.hasArg()
.desc("Write out new OBJ file with corrected vertex orders for facets.")
.build());
options.addOption(
Option.builder("numThreads")
options.addOption(Option.builder("numThreads")
.hasArg()
.desc("Number of threads to run. Default is 1.")
.build());
@@ -119,17 +121,18 @@ public class ValidateNormals implements TerrasaurTool {
private class FlippedNormalFinder implements Callable<List<Long>> {
private static final DateTimeFormatter defaultFormatter =
DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss z")
private static final DateTimeFormatter defaultFormatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss z")
.withLocale(Locale.getDefault())
.withZone(ZoneId.systemDefault());
private final long index0;
private final long index1;
private final boolean fast;
public FlippedNormalFinder(long index0, long index1) {
public FlippedNormalFinder(long index0, long index1, boolean fast) {
this.index0 = index0;
this.index1 = index1;
this.fast = fast;
}
@Override
@@ -149,8 +152,7 @@ public class ValidateNormals implements TerrasaurTool {
long elapsed = Instant.now().getEpochSecond() - startTime;
long estimatedFinish = (long) (elapsed / (pctDone / 100) + startTime);
String finish = defaultFormatter.format(Instant.ofEpochSecond(estimatedFinish));
logger.info(
String.format(
logger.info(String.format(
"Thread %d: read %d of %d facets. %.0f%% complete, projected finish %s",
Thread.currentThread().threadId(), index0 + i, index1, pctDone, finish));
}
@@ -158,20 +160,21 @@ public class ValidateNormals implements TerrasaurTool {
long index = index0 + i;
CellInfo ci = CellInfo.getCellInfo(polyData, index, idList);
getOBBTree().IntersectWithLine(origin, ci.center().toArray(), null, cellIds);
boolean isOpposite = (ci.center().dotProduct(ci.normal()) < 0);
// count up all crossings of the surface between the origin and the facet.
int numCrossings = 0;
if (isOpposite || !fast) {
// count up all crossings of the surface between the origin and the facet.
getOBBTree().IntersectWithLine(origin, ci.center().toArray(), null, cellIds);
for (int j = 0; j < cellIds.GetNumberOfIds(); j++) {
if (cellIds.GetId(j) == index) break;
numCrossings++;
}
}
// if numCrossings is even, the radial and normal should point in the same direction. If it
// is odd, the
// radial and normal should point in opposite directions.
// is odd, the radial and normal should point in opposite directions.
boolean shouldBeOpposite = (numCrossings % 2 == 1);
boolean isOpposite = (ci.center().dotProduct(ci.normal()) < 0);
// XOR operator - true if both conditions are different
if (isOpposite ^ shouldBeOpposite) flippedNormals.add(index);
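The crossing-parity rule above can be illustrated in isolation: count surface crossings between the body center and the facet center, then compare that parity against the sign of center dot normal. A sketch with a hypothetical standalone helper (not the tool's actual API; the OBB-tree intersection that produces numCrossings is assumed):

```java
// Sketch of ValidateNormals' flipped-normal test. With an even number of
// surface crossings between the origin and the facet, the outward normal
// should point with the radial direction; with an odd number (an overhang),
// it should point against it. A facet is flagged when reality and
// expectation disagree (XOR).
class NormalParity {

    /** true if the facet normal appears flipped. */
    static boolean isFlipped(int numCrossings, double centerDotNormal) {
        boolean shouldBeOpposite = (numCrossings % 2 == 1);
        boolean isOpposite = centerDotNormal < 0;
        return isOpposite ^ shouldBeOpposite; // flipped when the two disagree
    }

    public static void main(String[] args) {
        // Convex body, outward normal: not flipped.
        System.out.println(isFlipped(0, +1.0)); // false
        // Convex body, inward normal: flipped.
        System.out.println(isFlipped(0, -1.0)); // true
        // Under an overhang (one crossing), an opposing normal is expected.
        System.out.println(isFlipped(1, -1.0)); // false
    }
}
```

This is also why the new -fast option is safe only as a heuristic: it skips the intersection count when center and normal already agree, trading the overhang check for speed.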
@@ -207,8 +210,7 @@ public class ValidateNormals implements TerrasaurTool {
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
for (MessageLabel ml : startupMessages.keySet()) logger.info("{} {}", ml.label, startupMessages.get(ml));
NativeLibraryLoader.loadVtkLibraries();
@@ -234,8 +236,8 @@ public class ValidateNormals implements TerrasaurTool {
Set<Long> flippedNormals = new HashSet<>();
int numThreads =
cl.hasOption("numThreads") ? Integer.parseInt(cl.getOptionValue("numThreads")) : 1;
boolean fast = cl.hasOption("fast");
int numThreads = cl.hasOption("numThreads") ? Integer.parseInt(cl.getOptionValue("numThreads")) : 1;
try (ExecutorService executor = Executors.newFixedThreadPool(numThreads)) {
List<Future<List<Long>>> futures = new ArrayList<>();
@@ -244,7 +246,7 @@ public class ValidateNormals implements TerrasaurTool {
long fromIndex = i * numFacets;
long toIndex = Math.min(polyData.GetNumberOfCells(), fromIndex + numFacets);
FlippedNormalFinder fnf = app.new FlippedNormalFinder(fromIndex, toIndex);
FlippedNormalFinder fnf = app.new FlippedNormalFinder(fromIndex, toIndex, fast);
futures.add(executor.submit(fnf));
}
@@ -253,10 +255,7 @@ public class ValidateNormals implements TerrasaurTool {
executor.shutdown();
}
logger.info(
"Found {} flipped normals out of {} facets",
flippedNormals.size(),
polyData.GetNumberOfCells());
logger.info("Found {} flipped normals out of {} facets", flippedNormals.size(), polyData.GetNumberOfCells());
if (cl.hasOption("output")) {
NavigableSet<Long> sorted = new TreeSet<>(flippedNormals);
@@ -114,11 +114,9 @@ public class ValidateOBJ implements TerrasaurTool {
int n = (int) (Math.log(facetCount() / 3.) / Math.log(4.0) + 0.5);
if (meetsCondition) {
validationMsg =
String.format(
"Model has %d facets. This satisfies f = 3*4^n with n = %d.", facetCount(), n);
String.format("Model has %d facets. This satisfies f = 3*4^n with n = %d.", facetCount(), n);
} else {
validationMsg =
String.format(
validationMsg = String.format(
"Model has %d facets. This does not satisfy f = 3*4^n. A shape model with %.0f facets has n = %d.",
facetCount(), 3 * Math.pow(4, n), n);
}
@@ -132,13 +130,10 @@ public class ValidateOBJ implements TerrasaurTool {
public boolean testVertices() {
boolean meetsCondition = (facetCount() + 4) / 2 == vertexCount();
if (meetsCondition)
validationMsg =
String.format(
"Model has %d vertices and %d facets. This satisfies v = f/2+2.",
vertexCount(), facetCount());
validationMsg = String.format(
"Model has %d vertices and %d facets. This satisfies v = f/2+2.", vertexCount(), facetCount());
else
validationMsg =
String.format(
validationMsg = String.format(
"Model has %d vertices and %d facets. This does not satisfy v = f/2+2. Number of vertices should be %d.",
vertexCount(), facetCount(), facetCount() / 2 + 2);
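The two count checks above validate the invariants of an ICQ-derived model: f = 3*4^n facets and v = f/2 + 2 vertices. A minimal standalone sketch of the same arithmetic, with illustrative names not taken from the repository:

```java
public class IcqCountsDemo {
    // Nearest ICQ resolution index n for a given facet count, matching the
    // rounding used in testFacets: n = round(log(f/3) / log(4)).
    static int resolutionIndex(long facetCount) {
        return (int) (Math.log(facetCount / 3.0) / Math.log(4.0) + 0.5);
    }

    // Expected vertex count for a closed triangulated surface with f facets.
    static long expectedVertexCount(long facetCount) {
        return facetCount / 2 + 2;
    }

    public static void main(String[] args) {
        long f = 3L * 4096L; // n = 6 gives 3 * 4^6 = 12288 facets
        if (resolutionIndex(f) != 6) throw new AssertionError();
        if (expectedVertexCount(f) != 6146) throw new AssertionError();
    }
}
```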
@@ -160,8 +155,7 @@ public class ValidateOBJ implements TerrasaurTool {
polyData.GetPoint(i, iPt);
List<Long> closestVertices = new ArrayList<>();
for (Long id : sbm.findClosestVerticesWithinRadius(iPt, tol))
if (id > i) closestVertices.add(id);
for (Long id : sbm.findClosestVerticesWithinRadius(iPt, tol)) if (id > i) closestVertices.add(id);
if (!closestVertices.isEmpty()) map.put(i, closestVertices);
if (map.containsKey(i) && !map.get(i).isEmpty()) {
@@ -254,8 +248,7 @@ public class ValidateOBJ implements TerrasaurTool {
// would be faster to check if id0==id1||id0==id2||id1==id2 but there may be
// duplicate vertices
TriangularFacet facet =
new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
TriangularFacet facet = new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
double area = facet.getArea();
if (area == 0) {
zeroAreaFacets.add(i);
@@ -305,17 +298,15 @@ public class ValidateOBJ implements TerrasaurTool {
// would be faster to check if id0==id1||id0==id2||id1==id2 but there may be
// duplicate vertices
TriangularFacet facet =
new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
TriangularFacet facet = new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
if (facet.getArea() > 0) {
stats.addValue(facet.getCenter().getDot(facet.getNormal()));
stats.addValue(facet.getCenter().createUnitized().getDot(facet.getNormal()));
cStats.add(facet.getCenter());
nStats.add(facet.getNormal());
}
}
validationMsg =
String.format(
validationMsg = String.format(
"Using %d non-zero area facets: Mean angle between radial and normal is %f degrees, "
+ "angle between mean radial and mean normal is %f degrees",
stats.getN(),
@@ -327,40 +318,38 @@ public class ValidateOBJ implements TerrasaurTool {
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("obj").required().hasArg().desc("Shape model to validate.").build());
options.addOption(
Option.builder("logFile")
options.addOption(Option.builder("obj")
.required()
.hasArg()
.desc("Shape model to validate.")
.build());
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
options.addOption(
Option.builder("logLevel")
options.addOption(Option.builder("logLevel")
.hasArg()
.desc(
"If present, print messages above selected priority. Valid values are "
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim()
+ ". Default is INFO.")
.build());
options.addOption(Option.builder("output").hasArg().desc("Filename for output OBJ.").build());
options.addOption(
Option.builder("removeDuplicateVertices")
options.addOption(Option.builder("output")
.hasArg()
.desc("Filename for output OBJ.")
.build());
options.addOption(Option.builder("removeDuplicateVertices")
.desc("Remove duplicate vertices. Use with -output to save OBJ.")
.build());
options.addOption(
Option.builder("removeUnreferencedVertices")
options.addOption(Option.builder("removeUnreferencedVertices")
.desc("Remove unreferenced vertices. Use with -output to save OBJ.")
.build());
options.addOption(
Option.builder("removeZeroAreaFacets")
options.addOption(Option.builder("removeZeroAreaFacets")
.desc("Remove facets with zero area. Use with -output to save OBJ.")
.build());
options.addOption(
Option.builder("cleanup")
.desc(
"Combines -removeDuplicateVertices, -removeUnreferencedVertices, and -removeZeroAreaFacets.")
options.addOption(Option.builder("cleanup")
.desc("Combines -removeDuplicateVertices, -removeUnreferencedVertices, and -removeZeroAreaFacets.")
.build());
return options;
}
@@ -373,8 +362,7 @@ public class ValidateOBJ implements TerrasaurTool {
CommandLine cl = defaultOBJ.parseArgs(args, options);
Map<MessageLabel, String> startupMessages = defaultOBJ.startupMessages(cl);
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
for (MessageLabel ml : startupMessages.keySet()) logger.info("{} {}", ml.label, startupMessages.get(ml));
NativeLibraryLoader.loadVtkLibraries();
vtkPolyData polyData = PolyDataUtil.loadShapeModel(cl.getOptionValue("obj"));
@@ -406,8 +394,7 @@ public class ValidateOBJ implements TerrasaurTool {
final boolean cleanup = cl.hasOption("cleanup");
final boolean removeDuplicateVertices = cleanup || cl.hasOption("removeDuplicateVertices");
final boolean removeUnreferencedVertices =
cleanup || cl.hasOption("removeUnreferencedVertices");
final boolean removeUnreferencedVertices = cleanup || cl.hasOption("removeUnreferencedVertices");
final boolean removeZeroAreaFacets = cleanup || cl.hasOption("removeZeroAreaFacets");
if (removeDuplicateVertices) polyData = PolyDataUtil.removeDuplicatePoints(polyData);
@@ -418,7 +405,7 @@ public class ValidateOBJ implements TerrasaurTool {
if (cl.hasOption("output")) {
PolyDataUtil.saveShapeModelAsOBJ(polyData, cl.getOptionValue("output"));
logger.info(String.format("Wrote OBJ file %s", cl.getOptionValue("output")));
logger.info("Wrote OBJ file {}", cl.getOptionValue("output"));
}
}
}
@@ -22,15 +22,16 @@
*/
package terrasaur.config;
import java.util.List;
import jackfruit.annotations.Comment;
import jackfruit.annotations.DefaultValue;
import jackfruit.annotations.Jackfruit;
import java.util.List;
@Jackfruit
public interface CKFromSumFileConfig {
@Comment("""
@Comment(
"""
Body fixed frame for the target body. If blank, use SPICE-defined
body fixed frame. This will be the reference frame unless the J2000
parameter is set to true.""")
@@ -41,14 +42,16 @@ public interface CKFromSumFileConfig {
@DefaultValue("DIMORPHOS")
String bodyName();
@Comment("""
@Comment(
"""
Extend CK past the last sumFile by this number of seconds. Default
is zero. Attitude is assumed to be fixed to the value given by the
last sumfile.""")
@DefaultValue("0")
double extend();
@Comment("""
@Comment(
"""
SPC defines the camera X axis to be increasing to the right, Y to
be increasing down, and Z to point into the page:
@@ -85,7 +88,8 @@ public interface CKFromSumFileConfig {
@DefaultValue("-3")
int flipZ();
@Comment("""
@Comment(
"""
Supply this frame kernel to MSOPCK. Only needed if the reference frame
(set by bodyFrame or J2000) is not built into SPICE""")
@DefaultValue("/project/dart/data/SPICE/flight/fk/didymos_system_001.tf")
@@ -111,11 +115,10 @@ public interface CKFromSumFileConfig {
@DefaultValue("DART_SPACECRAFT")
String spacecraftFrame();
@Comment("""
@Comment(
"""
SPICE metakernel to read. This may be specified more than once
for multiple metakernels.""")
@DefaultValue("/project/dart/data/SPICE/flight/mk/current.tm")
List<String> metakernel();
}
@@ -29,8 +29,8 @@ import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.spi.StandardLevel;
import terrasaur.utils.saaPlotLib.colorMaps.ColorRamp;
import terrasaur.utils.Log4j2Configurator;
import terrasaur.utils.saaPlotLib.colorMaps.ColorRamp;
public class CommandLineOptions {
@@ -42,7 +42,11 @@ public class CommandLineOptions {
* @return
*/
public static Option addConfig() {
return Option.builder("config").hasArg().required().desc("Configuration file to load").build();
return Option.builder("config")
.hasArg()
.required()
.desc("Configuration file to load")
.build();
}
/**
@@ -56,8 +60,7 @@ public class CommandLineOptions {
for (ColorRamp.TYPE t : ColorRamp.TYPE.values()) sb.append(String.format("%s ", t.name()));
return Option.builder("colorRamp")
.hasArg()
.desc(
"Color ramp style. Valid values are "
.desc("Color ramp style. Valid values are "
+ sb.toString().trim()
+ ". Default is "
+ defaultCRType.name()
@@ -73,9 +76,9 @@ public class CommandLineOptions {
* @return
*/
public static ColorRamp.TYPE getColorRamp(CommandLine cl, ColorRamp.TYPE defaultCRType) {
ColorRamp.TYPE crType =
cl.hasOption("colorRamp")
? ColorRamp.TYPE.valueOf(cl.getOptionValue("colorRamp").toUpperCase().strip())
ColorRamp.TYPE crType = cl.hasOption("colorRamp")
? ColorRamp.TYPE.valueOf(
cl.getOptionValue("colorRamp").toUpperCase().strip())
: defaultCRType;
return crType;
}
@@ -150,8 +153,7 @@ public class CommandLineOptions {
for (StandardLevel l : StandardLevel.values()) sb.append(String.format("%s ", l.name()));
return Option.builder("logLevel")
.hasArg()
.desc(
"If present, print messages above selected priority. Valid values are "
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim()
+ ". Default is INFO.")
.build();
@@ -165,7 +167,8 @@ public class CommandLineOptions {
public static void setLogLevel(CommandLine cl) {
Log4j2Configurator lc = Log4j2Configurator.getInstance();
if (cl.hasOption("logLevel"))
lc.setLevel(Level.valueOf(cl.getOptionValue("logLevel").toUpperCase().trim()));
lc.setLevel(
Level.valueOf(cl.getOptionValue("logLevel").toUpperCase().trim()));
}
/**
@@ -25,7 +25,6 @@ package terrasaur.config;
import jackfruit.annotations.Comment;
import jackfruit.annotations.DefaultValue;
import jackfruit.annotations.Jackfruit;
import java.util.List;
@Jackfruit(prefix = "mission")
@@ -36,7 +36,9 @@ public interface SPCBlock {
###############################################################################
""";
@Comment(introLines + """
@Comment(
introLines
+ """
SPC defines the camera X axis to be increasing to the right, Y to
be increasing down, and Z to point into the page:
@@ -72,5 +74,4 @@ public interface SPCBlock {
@Comment("Map the camera Z axis to a SPICE axis. See flipX for details.")
@DefaultValue("-3")
int flipZ();
}
@@ -92,14 +92,12 @@ public class TerrasaurConfig {
PropertiesConfiguration config = new ConfigBlockFactory().toConfig(instance.configBlock);
PropertiesConfigurationLayout layout = config.getLayout();
String now =
DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
String now = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
.withLocale(Locale.getDefault())
.withZone(ZoneOffset.UTC)
.format(Instant.now());
layout.setHeaderComment(
String.format(
"Configuration file for %s\nCreated %s UTC", AppVersion.getVersionString(), now));
String.format("Configuration file for %s\nCreated %s UTC", AppVersion.getVersionString(), now));
config.write(pw);
} catch (ConfigurationException | IOException e) {
@@ -107,5 +105,4 @@ public class TerrasaurConfig {
}
return string.toString();
}
}
@@ -92,10 +92,8 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
/*
* DTM fits. Includes fits files at various intermediate stages such as the initial fits file from
* Maplet2Fits or CreateMapola, the fits files that contain gravity planes output from
@@ -126,7 +124,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
// OBJ shape model as FITS file
@@ -153,7 +150,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
// SIGMA. Used in calculation of some of the ALTWG products.
@@ -178,7 +174,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
VERTEX_ERROR("Sigma of vertex vector", "vrt", "km", "Sigma", null, "Vertex Radius") {
@@ -203,8 +198,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
/*
@@ -238,7 +231,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return AltwgDataType.NORMALX_UNCERTAINTY;
}
},
/*
@@ -266,7 +258,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NORMALY_UNCERTAINTY;
}
},
/*
@@ -294,7 +285,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NORMALZ_UNCERTAINTY;
}
},
GRAVITY_VECTOR_X("gravity vector", "grv", "m/s^2", "Gravity vector X", null, null) {
@@ -317,7 +307,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return GRAVITY_VECTOR_X_UNCERTAINTY;
}
},
/*
@@ -345,7 +334,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return AltwgDataType.GRAVITY_VECTOR_Y_UNCERTAINTY;
}
},
/*
@@ -373,11 +361,9 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return GRAVITY_VECTOR_Z_UNCERTAINTY;
}
},
GRAVITATIONAL_MAGNITUDE("gravitational magnitude", "grm", "m/s^2", "Gravitational magnitude",
null, null) {
GRAVITATIONAL_MAGNITUDE("gravitational magnitude", "grm", "m/s^2", "Gravitational magnitude", null, null) {
@Override
public int getGlobalSpocID() {
return 10;
@@ -397,11 +383,9 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return AltwgDataType.GRAVITATIONAL_MAGNITUDE_UNCERTAINTY;
}
},
GRAVITATIONAL_POTENTIAL("gravitational potential", "pot", "J/kg", "Gravitational potential", null,
null) {
GRAVITATIONAL_POTENTIAL("gravitational potential", "pot", "J/kg", "Gravitational potential", null, null) {
@Override
public int getGlobalSpocID() {
return 11;
@@ -421,7 +405,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return AltwgDataType.GRAVITATIONAL_POTENTIAL_UNCERTAINTY;
}
},
ELEVATION("elevation", "elv", "meters", "Elevation", null, null) {
@@ -444,7 +427,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return EL_UNCERTAINTY;
}
},
// No longer needed! This is same as HEIGHT plane!
@@ -486,7 +468,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return TILT_UNCERTAINTY;
}
},
FACET_MAP("Facet Map", "ffi", null, "Facet Coarse Index", null, "Facet Coarse Index") {
@@ -510,7 +491,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
FACET_TILT("Facet Tilt", "fti", "degrees", "Facet tilt", null, "Facet Tilt") {
@@ -533,11 +513,10 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return TILT_UNCERTAINTY;
}
},
FACET_TILT_DIRECTION("Facet Tilt Direction", "fdi", "degrees", "Facet tilt direction", null,
"Facet Tilt Direction") {
FACET_TILT_DIRECTION(
"Facet Tilt Direction", "fdi", "degrees", "Facet tilt direction", null, "Facet Tilt Direction") {
@Override
public int getGlobalSpocID() {
return 35;
@@ -557,7 +536,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return TILT_UNCERTAINTY;
}
},
TILT_AVG("Mean Tilt", "mti", "degrees", "Mean tilt", null, "Mean Tilt") {
@@ -580,8 +558,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return TILT_AVG_UNCERTAINTY;
}
},
TILT_VARIATION("Tilt Variation", "tiv", "degrees", "Tilt variation", null, "Tilt Variation") {
@@ -604,11 +580,9 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return TILT_RELATIVE_UNCERTAINTY;
}
},
TILT_AVG_DIRECTION("Mean Tilt Direction", "mdi", "degrees", "Mean tilt direction", null,
"Mean Tilt Direction") {
TILT_AVG_DIRECTION("Mean Tilt Direction", "mdi", "degrees", "Mean tilt direction", null, "Mean Tilt Direction") {
@Override
public int getGlobalSpocID() {
return 37;
@@ -628,11 +602,15 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return TILT_AVG_DIRECTION_UNCERTAINTY;
}
},
TILT_DIRECTION_VARIATION("Tilt Direction Variation", "div", "degrees", "Tilt direction variation",
null, "Tilt Direction Variation") {
TILT_DIRECTION_VARIATION(
"Tilt Direction Variation",
"div",
"degrees",
"Tilt direction variation",
null,
"Tilt Direction Variation") {
@Override
public int getGlobalSpocID() {
return 39;
@@ -652,7 +630,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return TILT_RELATIVE_DIRECTION_UNCERTAINTY;
}
},
TILT_RELATIVE("Relative Tilt", "rti", "degrees", "Relative tilt", null, "Relative Tilt") {
@@ -675,11 +652,10 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return TILT_RELATIVE_UNCERTAINTY;
}
},
TILT_RELATIVE_DIRECTION("Relative Tilt Direction", "rdi", "degrees", "Relative tilt direction",
null, "Relative Tilt Direction") {
TILT_RELATIVE_DIRECTION(
"Relative Tilt Direction", "rdi", "degrees", "Relative tilt direction", null, "Relative Tilt Direction") {
@Override
public int getGlobalSpocID() {
return 43;
@@ -699,7 +675,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return TILT_RELATIVE_DIRECTION_UNCERTAINTY;
}
},
/*
@@ -711,8 +686,8 @@ public enum AltwgDataType {
* getAnciColName() returns the 'description' string
*/
RELATIVE_HEIGHT("Max height/depth within an ellipse", "mht", "km", "Max relative height", null,
"Max Relative Height") {
RELATIVE_HEIGHT(
"Max height/depth within an ellipse", "mht", "km", "Max relative height", null, "Max Relative Height") {
@Override
public int getGlobalSpocID() {
return 49;
@@ -732,7 +707,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return RELATIVE_HEIGHT_UNCERTAINTY;
}
},
FACETRAD("Facet Radius", "rad", "km", "Facet radius", null, "Facet_Center_Radius") {
@@ -755,7 +729,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return FACETRAD_UNCERTAINTY;
}
},
/*
@@ -763,8 +736,7 @@ public enum AltwgDataType {
* fileFrag, String units, String headerValue, String comment, String anciColName)
*/
TPLATEANCI("Ancillary Template Scalar", "tes", "TBDCOLUNITS", "AncillaryTemplate", null,
"TBDCOLNAME") {
TPLATEANCI("Ancillary Template Scalar", "tes", "TBDCOLUNITS", "AncillaryTemplate", null, "TBDCOLNAME") {
@Override
public int getGlobalSpocID() {
return -1;
@@ -785,12 +757,10 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
// ancillary template, scalar
TPLATEANCIVEC("Ancillary Template Vector", "tev", "TBD", "AncillaryTemplate", null,
"TBDCOLNAME") {
TPLATEANCIVEC("Ancillary Template Vector", "tev", "TBD", "AncillaryTemplate", null, "TBDCOLNAME") {
@Override
public int getGlobalSpocID() {
return -1;
@@ -811,7 +781,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
AREA("Facet Area", "are", "km^2", "Facet Area", null, "Facet Area") {
@@ -835,7 +804,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return AREA_UNCERTAINTY;
}
},
/*
@@ -846,7 +814,12 @@ public enum AltwgDataType {
* fits header anciColName - string to use when defining the column name in the ancillary fits
* file. If null then getAnciColName() returns the 'description' string
*/
AREA_UNCERTAINTY("FacetArea Uncertainty", "notapplicable", "km^2", "facet area uncertainty", null,
AREA_UNCERTAINTY(
"FacetArea Uncertainty",
"notapplicable",
"km^2",
"facet area uncertainty",
null,
"facet area uncertainty") {
@Override
@@ -868,12 +841,9 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
EL_UNCERTAINTY("Elevation Uncertainty", "notapplicable", "m", "El uncertainty", null,
"elevation uncertainty") {
EL_UNCERTAINTY("Elevation Uncertainty", "notapplicable", "m", "El uncertainty", null, "elevation uncertainty") {
@Override
public int getGlobalSpocID() {
@@ -894,7 +864,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
/*
@@ -907,8 +876,8 @@ public enum AltwgDataType {
*/
// Uncertainty in gravity vector, x component
GRAVITY_VECTOR_X_UNCERTAINTY("Grav_X Uncertainty", "notapplicable", "m/s^2", "GravX uncertainty",
null, "vector component uncertainty") {
GRAVITY_VECTOR_X_UNCERTAINTY(
"Grav_X Uncertainty", "notapplicable", "m/s^2", "GravX uncertainty", null, "vector component uncertainty") {
@Override
// not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -932,12 +901,11 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
// Uncertainty in gravity vector, y component
GRAVITY_VECTOR_Y_UNCERTAINTY("Grav_Y Uncertainty", "notapplicable", "m/s^2", "GravY uncertainty",
null, "vector component uncertainty") {
GRAVITY_VECTOR_Y_UNCERTAINTY(
"Grav_Y Uncertainty", "notapplicable", "m/s^2", "GravY uncertainty", null, "vector component uncertainty") {
@Override
// not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -961,12 +929,11 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
// Uncertainty in gravity vector, z component
GRAVITY_VECTOR_Z_UNCERTAINTY("Grav_Z Uncertainty", "notapplicable", "m/s^2", "GravZ uncertainty",
null, "vector component uncertainty") {
GRAVITY_VECTOR_Z_UNCERTAINTY(
"Grav_Z Uncertainty", "notapplicable", "m/s^2", "GravZ uncertainty", null, "vector component uncertainty") {
@Override
// not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -990,12 +957,16 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
// Uncertainty in gravitational magnitude
GRAVITATIONAL_MAGNITUDE_UNCERTAINTY("Grav Mag Uncertainty", "notapplicable", "m/s^2",
"GravMag uncertainty", null, "grav magnitude uncertainty") {
GRAVITATIONAL_MAGNITUDE_UNCERTAINTY(
"Grav Mag Uncertainty",
"notapplicable",
"m/s^2",
"GravMag uncertainty",
null,
"grav magnitude uncertainty") {
@Override
// not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -1019,12 +990,16 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
// Uncertainty in gravitational potential
GRAVITATIONAL_POTENTIAL_UNCERTAINTY("Grav Pot Uncertainty", "notapplicable", "m/s^2",
"GravPot Uncertainty", null, "grav potential uncertainty") {
GRAVITATIONAL_POTENTIAL_UNCERTAINTY(
"Grav Pot Uncertainty",
"notapplicable",
"m/s^2",
"GravPot Uncertainty",
null,
"grav potential uncertainty") {
@Override
// not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -1048,12 +1023,15 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
// Uncertainty in Normal Vector, X component
NORMALX_UNCERTAINTY("Normal X Uncertainty", "notapplicable", null, "Normal X Uncertainty", null,
NORMALX_UNCERTAINTY(
"Normal X Uncertainty",
"notapplicable",
null,
"Normal X Uncertainty",
null,
"vector component uncertainty") {
@Override
@@ -1078,11 +1056,15 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
// Uncertainty in Normal Vector, Y component
NORMALY_UNCERTAINTY("Normal Y Uncertainty", "notapplicable", null, "Normal Y Uncertainty", null,
NORMALY_UNCERTAINTY(
"Normal Y Uncertainty",
"notapplicable",
null,
"Normal Y Uncertainty",
null,
"vector component uncertainty") {
@Override
@@ -1107,11 +1089,15 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
// Uncertainty in Normal Vector, Z component
NORMALZ_UNCERTAINTY("Normal Z Uncertainty", "notapplicable", null, "Normal Z Uncertainty", null,
NORMALZ_UNCERTAINTY(
"Normal Z Uncertainty",
"notapplicable",
null,
"Normal Z Uncertainty",
null,
"vector component uncertainty") {
@Override
@@ -1136,7 +1122,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
// tilt uncertainty
@@ -1147,8 +1132,7 @@ public enum AltwgDataType {
* anciColName - string to use when defining the column name in the ancillary fits file. If null
* then getAnciColName() returns the 'description' string
*/
TILT_UNCERTAINTY("Tilt Uncertainty", "notapplicable", "degrees", "Tilt Uncertainty", null,
"tilt uncertainty") {
TILT_UNCERTAINTY("Tilt Uncertainty", "notapplicable", "degrees", "Tilt Uncertainty", null, "tilt uncertainty") {
@Override
// not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -1172,15 +1156,19 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
/*
* tilt variation standard error. Used to populate the SIGMA column for the tilt variation
* ancillary product.
*/
TILTVAR_UNCERTAINTY("Tilt Variation Uncertainty", "notapplicable", "degrees",
"Tilt Variation Uncertainty", null, "variation uncertainty") {
TILTVAR_UNCERTAINTY(
"Tilt Variation Uncertainty",
"notapplicable",
"degrees",
"Tilt Variation Uncertainty",
null,
"variation uncertainty") {
@Override
// not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -1204,15 +1192,19 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
/*
* tilt variation standard error. Used to populate the SIGMA column for the tilt variation
* ancillary product.
*/
TILTDIRVAR_UNCERTAINTY("Tilt Direction Variation Uncertainty", "notapplicable", "degrees",
"Tilt Direction Variation Uncertainty", null, "direction variation uncertainty") {
TILTDIRVAR_UNCERTAINTY(
"Tilt Direction Variation Uncertainty",
"notapplicable",
"degrees",
"Tilt Direction Variation Uncertainty",
null,
"direction variation uncertainty") {
@Override
// not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -1236,10 +1228,8 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
/*
* albedo from SPC description - general description of the product type fileFrag - string
* fragment to use in ALTWG file naming convention units - units of the data values headerValue -
@@ -1269,11 +1259,10 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return ALBEDO_UNCERTAINTY;
}
},
ALBEDO_UNCERTAINTY("Albedo Uncertainty", "notapplicable", "unitless", "Albedo uncertainty", null,
"albedo uncertainty") {
ALBEDO_UNCERTAINTY(
"Albedo Uncertainty", "notapplicable", "unitless", "Albedo uncertainty", null, "albedo uncertainty") {
@Override
// not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -1297,7 +1286,6 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
// OLA intensity. Used in place of albedo
@@ -1323,10 +1311,8 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return INTENSITY_UNCERTAINTY;
}
},
// OLA intensity. Used in place of albedo
INTENSITY_UNCERTAINTY("Intensity", "alb", "unitless", "Intensity", null, "intensity") {
@@ -1350,10 +1336,8 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
/*
* Each enumeration has the following properties in the order shown: description - general
* description of the product type fileFrag - string fragment to use in ALTWG file naming
@@ -1363,8 +1347,13 @@ public enum AltwgDataType {
* getAnciColName() returns the 'description' string
*/
RELATIVE_HEIGHT_UNCERTAINTY("Max relative height uncertainty", "notapplicable", "km",
"Max relative height uncertainty", null, "max relative height uncertainty") {
RELATIVE_HEIGHT_UNCERTAINTY(
"Max relative height uncertainty",
"notapplicable",
"km",
"Max relative height uncertainty",
null,
"max relative height uncertainty") {
@Override
public int getGlobalSpocID() {
return -1;
@@ -1384,12 +1373,16 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
// facet radius uncertainty
FACETRAD_UNCERTAINTY("Facet Radius Uncertainty", "notapplicable", null,
"Facet Radius Uncertainty", null, "facet radius uncertainty") {
FACETRAD_UNCERTAINTY(
"Facet Radius Uncertainty",
"notapplicable",
null,
"Facet Radius Uncertainty",
null,
"facet radius uncertainty") {
@Override
// not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -1413,13 +1406,11 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
// uncertainty in mean tilt (TILT_AVG)
TILT_AVG_UNCERTAINTY("Mean Tilt Uncertainty", "notapplicable", null, "Mean Tilt Uncertainty",
null, "mean tilt uncertainty") {
TILT_AVG_UNCERTAINTY(
"Mean Tilt Uncertainty", "notapplicable", null, "Mean Tilt Uncertainty", null, "mean tilt uncertainty") {
@Override
// not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -1443,13 +1434,16 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
// uncertainty in mean tilt direction (TILT_AVG_DIRECTION)
TILT_AVG_DIRECTION_UNCERTAINTY("Mean Tilt Direction Uncertainty", "notapplicable", null,
"Mean Tilt Direction Uncertainty", null, "mean tilt direction uncertainty") {
TILT_AVG_DIRECTION_UNCERTAINTY(
"Mean Tilt Direction Uncertainty",
"notapplicable",
null,
"Mean Tilt Direction Uncertainty",
null,
"mean tilt direction uncertainty") {
@Override
// not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -1473,13 +1467,16 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
// uncertainty in relative tilt (TILT_RELATIVE)
TILT_RELATIVE_UNCERTAINTY("Relative Tilt Uncertainty", "notapplicable", null,
"Relative Tilt Uncertainty", null, "relative tilt uncertainty") {
TILT_RELATIVE_UNCERTAINTY(
"Relative Tilt Uncertainty",
"notapplicable",
null,
"Relative Tilt Uncertainty",
null,
"relative tilt uncertainty") {
@Override
// not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -1503,42 +1500,16 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
// uncertainty in relative tilt direction (TILT_RELATIVE_DIRECTION)
TILT_RELATIVE_DIRECTION_UNCERTAINTY("Relative Tilt Direction Uncertainty", "notapplicable", null,
"Relative Tilt Direction Uncertainty", null, "relative tilt direction uncertainty") {
@Override
// not a SPOC product, just the SIGMA column in the ancillary fits file
public int getGlobalSpocID() {
return -1;
}
@Override
// not a SPOC product, just the SIGMA column in the ancillary fits file
public int getLocalSpocID() {
return -1;
}
// no such plane in DTM
@Override
public PlaneInfo getPlaneInfo() {
return null;
}
@Override
public AltwgDataType getSigmaType() {
return NA;
}
},
// uncertainty in relative tilt direction (TILT_RELATIVE_DIRECTION)
TILT_DIRECTION_UNCERTAINTY("Tilt Direction Uncertainty", "notapplicable", null,
"Tilt Direction Uncertainty", null, "tilt direction uncertainty") {
TILT_RELATIVE_DIRECTION_UNCERTAINTY(
"Relative Tilt Direction Uncertainty",
"notapplicable",
null,
"Relative Tilt Direction Uncertainty",
null,
"relative tilt direction uncertainty") {
@Override
// not a SPOC product, just the SIGMA column in the ancillary fits file
@@ -1562,7 +1533,39 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
},
// uncertainty in relative tilt direction (TILT_RELATIVE_DIRECTION)
TILT_DIRECTION_UNCERTAINTY(
"Tilt Direction Uncertainty",
"notapplicable",
null,
"Tilt Direction Uncertainty",
null,
"tilt direction uncertainty") {
@Override
// not a SPOC product, just the SIGMA column in the ancillary fits file
public int getGlobalSpocID() {
return -1;
}
@Override
// not a SPOC product, just the SIGMA column in the ancillary fits file
public int getLocalSpocID() {
return -1;
}
// no such plane in DTM
@Override
public PlaneInfo getPlaneInfo() {
return null;
}
@Override
public AltwgDataType getSigmaType() {
return NA;
}
},
/*
@@ -1590,10 +1593,8 @@ public enum AltwgDataType {
public AltwgDataType getSigmaType() {
return NA;
}
};
private String description; // description of enumeration
private String fileFrag;
private String units;
@@ -1601,8 +1602,8 @@ public enum AltwgDataType {
private String comment; // comment to add to fits header (optional)
private String anciColName;
AltwgDataType(String description, String fileFrag, String units, String headerValue,
String comment, String anciColName) {
AltwgDataType(
String description, String fileFrag, String units, String headerValue, String comment, String anciColName) {
this.description = description;
this.fileFrag = fileFrag;
this.units = units;
@@ -1636,10 +1637,8 @@ public enum AltwgDataType {
}
public String getHeaderValueWithUnits() {
if (units == null)
return headerValue;
else
return headerValue + " (" + units + ")";
if (units == null) return headerValue;
else return headerValue + " (" + units + ")";
}
public String desc() {
@@ -1665,9 +1664,11 @@ public enum AltwgDataType {
* AltwgProductTypes.
*
*/
public static final EnumSet<AltwgDataType> anciGravitySet =
EnumSet.of(AltwgDataType.GRAVITY_VECTOR_X, AltwgDataType.GRAVITATIONAL_MAGNITUDE,
AltwgDataType.GRAVITATIONAL_POTENTIAL, AltwgDataType.ELEVATION);
public static final EnumSet<AltwgDataType> anciGravitySet = EnumSet.of(
AltwgDataType.GRAVITY_VECTOR_X,
AltwgDataType.GRAVITATIONAL_MAGNITUDE,
AltwgDataType.GRAVITATIONAL_POTENTIAL,
AltwgDataType.ELEVATION);
// slope is no longer in anciGravitySet, because it depends on errors that are determined
// in Shape2Tilt. See MultiTableToAncillaryFits.java
@@ -1695,17 +1696,22 @@ public enum AltwgDataType {
// EnumSet.range(AltwgDataType.NORMAL_VECTOR_X,
// AltwgDataType.SLOPE);
public static final EnumSet<AltwgDataType> gravityPlanes =
EnumSet.of(AltwgDataType.NORMAL_VECTOR_X, AltwgDataType.NORMAL_VECTOR_Y,
AltwgDataType.NORMAL_VECTOR_Z, AltwgDataType.GRAVITY_VECTOR_X,
AltwgDataType.GRAVITY_VECTOR_Y, AltwgDataType.GRAVITY_VECTOR_Z,
AltwgDataType.GRAVITATIONAL_MAGNITUDE, AltwgDataType.GRAVITATIONAL_POTENTIAL,
AltwgDataType.ELEVATION, AltwgDataType.SLOPE, AltwgDataType.AREA);
public static final EnumSet<AltwgDataType> gravityPlanes = EnumSet.of(
AltwgDataType.NORMAL_VECTOR_X,
AltwgDataType.NORMAL_VECTOR_Y,
AltwgDataType.NORMAL_VECTOR_Z,
AltwgDataType.GRAVITY_VECTOR_X,
AltwgDataType.GRAVITY_VECTOR_Y,
AltwgDataType.GRAVITY_VECTOR_Z,
AltwgDataType.GRAVITATIONAL_MAGNITUDE,
AltwgDataType.GRAVITATIONAL_POTENTIAL,
AltwgDataType.ELEVATION,
AltwgDataType.SLOPE,
AltwgDataType.AREA);
public static final EnumSet<AltwgDataType> tiltPlanes =
EnumSet.range(AltwgDataType.FACET_TILT, AltwgDataType.RELATIVE_HEIGHT);
/**
* Unique identifier for ALTWG global product in OSIRIS-REx SPOC system.
*

View File

@@ -25,9 +25,16 @@ package terrasaur.enums;
import org.apache.commons.io.FilenameUtils;
public enum FORMATS {
ASCII(true), BIN3(true), BIN4(true), BIN7(true), FITS(false), ICQ(false), OBJ(false), PLT(
false), PLY(false), VTK(false);
ASCII(true),
BIN3(true),
BIN4(true),
BIN7(true),
FITS(false),
ICQ(false),
OBJ(false),
PLT(false),
PLY(false),
VTK(false);
/** True if this format contains no facet information */
public boolean pointsOnly;
@@ -78,5 +85,4 @@ public enum FORMATS {
return null;
}
}

View File

@@ -30,15 +30,22 @@ package terrasaur.enums;
*
*/
public enum FitsHeaderType {
ANCIGLOBALGENERIC, ANCILOCALGENERIC, ANCIGLOBALALTWG, ANCIG_FACETRELATION_ALTWG, ANCILOCALALTWG, DTMGLOBALALTWG, DTMLOCALALTWG, DTMGLOBALGENERIC, DTMLOCALGENERIC, NFTMLN;
ANCIGLOBALGENERIC,
ANCILOCALGENERIC,
ANCIGLOBALALTWG,
ANCIG_FACETRELATION_ALTWG,
ANCILOCALALTWG,
DTMGLOBALALTWG,
DTMLOCALALTWG,
DTMGLOBALGENERIC,
DTMLOCALGENERIC,
NFTMLN;
public static boolean isGLobal(FitsHeaderType hdrType) {
boolean globalProduct;
switch (hdrType) {
case ANCIGLOBALALTWG:
case ANCIGLOBALGENERIC:
case DTMGLOBALALTWG:

View File

@@ -148,7 +148,6 @@ public enum PlaneInfo {
coreTilts.add(PlaneInfo.RELATIVE_HEIGHT);
return coreTilts;
}
/**
@@ -161,8 +160,7 @@ public enum PlaneInfo {
* @return
* @throws HeaderCardException
*/
public static List<HeaderCard> planesToHeaderCard(List<PlaneInfo> planeList)
throws HeaderCardException {
public static List<HeaderCard> planesToHeaderCard(List<PlaneInfo> planeList) throws HeaderCardException {
List<HeaderCard> planeHeaders = new ArrayList<HeaderCard>();
String plane = "PLANE";
int cc = 1;
@@ -172,7 +170,5 @@ public enum PlaneInfo {
cc++;
}
return planeHeaders;
}
}

View File

@@ -32,7 +32,6 @@ import com.google.common.base.Strings;
*
*/
public enum SigmaFileType {
SPCSIGMA {
@Override
@@ -118,7 +117,6 @@ public enum SigmaFileType {
public static SigmaFileType sigmaTypeFromSrcType(SrcProductType srcType) {
SigmaFileType sigmaType = SigmaFileType.NOMATCH;
switch (srcType) {
case SPC:
sigmaType = SigmaFileType.SPCSIGMA;
break;
@@ -130,7 +128,6 @@ public enum SigmaFileType {
default:
sigmaType = SigmaFileType.NOMATCH;
break;
}
return sigmaType;

View File

@@ -30,14 +30,12 @@ package terrasaur.enums;
*
*/
public enum SrcProductType {
SFM {
@Override
public String getAltwgFrag() {
return "sfm";
}
},
SPC {
@@ -68,7 +66,6 @@ public enum SrcProductType {
public String getAltwgFrag() {
return "tru";
}
},
UNKNOWN {
@@ -106,10 +103,8 @@ public enum SrcProductType {
public static SrcProductType fromAltwgFrag(String stringFrag) {
for (SrcProductType prodType : SrcProductType.values()) {
if (prodType.getAltwgFrag().equals(stringFrag))
return prodType;
if (prodType.getAltwgFrag().equals(stringFrag)) return prodType;
}
return UNKNOWN;
}
}

View File

@@ -35,7 +35,16 @@ public enum AltPipelnEnum {
// controls whether or not certain global products get created.
// REQUIRED TO BE IN CONFIGFILE:
DOGLOBALSHAPE, OBJTOFITS, ADDOBJCOMMENTS, GLOBALRES0, DUMBERVALUES, DOGRAVGLOBAL, DOGLOBALGRAV_ANCI, DOGLOBALTILT_ANCI, DOGLOBALMISC_ANCI, DOGLOBALTILT,
DOGLOBALSHAPE,
OBJTOFITS,
ADDOBJCOMMENTS,
GLOBALRES0,
DUMBERVALUES,
DOGRAVGLOBAL,
DOGLOBALGRAV_ANCI,
DOGLOBALTILT_ANCI,
DOGLOBALMISC_ANCI,
DOGLOBALTILT,
// controls number of slots per job to use when running global distributed gravity
// in grid engine mode. Does not apply if running in local parallel mode
@@ -49,17 +58,26 @@ public enum AltPipelnEnum {
DSKERNEL,
// enable/disable the creation of global and local DSK files
DOGLOBALDSK, DOLOCALDSK,
DOGLOBALDSK,
DOLOCALDSK,
// global tilt semi-major axis in km. Now a required variable
GTILTMAJAXIS,
// settings for every local product generated by the pipeline
USEOLDMAPLETS, DODTMFITSOBJ, DOGRAVLOCAL, GENLOCALGRAV, DOLOCALTILT, DOLOCAL_GRAVANCI, DOLOCAL_TILTANCI, DOLOCAL_MISCANCI,
USEOLDMAPLETS,
DODTMFITSOBJ,
DOGRAVLOCAL,
GENLOCALGRAV,
DOLOCALTILT,
DOLOCAL_GRAVANCI,
DOLOCAL_TILTANCI,
DOLOCAL_MISCANCI,
// controls whether to use average gravitational potential for global and local gravity
// calculations. =0 use minimum reference potential, =1 use average reference potential
AVGGRAVPOTGLOBAL, AVGGRAVPOTLOCAL,
AVGGRAVPOTGLOBAL,
AVGGRAVPOTLOCAL,
// controls RunBigMap.
// integrate slope to height required. Defaults to "n" if this enum does not exist in config file.
@@ -73,18 +91,27 @@ public enum AltPipelnEnum {
// controls the source data, product destination, naming convention,
// and process flow.
DATASOURCE, REPORTTYPE, INDATADIR, OUTPUTDIR, PDSNAMING, RENAMEGLOBALOBJ, USEBIGMAPS, DOREPORT,
DATASOURCE,
REPORTTYPE,
INDATADIR,
OUTPUTDIR,
PDSNAMING,
RENAMEGLOBALOBJ,
USEBIGMAPS,
DOREPORT,
// shape model density and rotation rate are now required variables. This way we can easily spot
// what we are using
// as defaults.
SMDENSITY, SMRRATE,
SMDENSITY,
SMRRATE,
// stores type of naming convention. Ex. AltProduct, AltNFTMLN, DartProduct.
NAMINGCONVENTION,
// set values that cannot be derived from data.
REPORTBASENAME, VERSION,
REPORTBASENAME,
VERSION,
// everything after this is not a required keyword
@@ -109,7 +136,8 @@ public enum AltPipelnEnum {
// (optional) controls whether or not STL files get generated. If these do not exist in the
// pipeConfig file then they will NOT
// get generated!
GLOBALSTL, LOCALSTL,
GLOBALSTL,
LOCALSTL,
// keywords for local products
//
@@ -120,11 +148,26 @@ public enum AltPipelnEnum {
// MAPFILE: contains pointer to map centers file (optional). Used by TileShapeModelWithBigmaps.
// defaults to auto-generated tiles if this is not specified.
// allow for pointers to different files for 30cm and 10cm map products.
DOLOCALMAP, MAPDIR, MAPSCALE, MAPHALFSZ, REPORT, MAPSmallerSZ, MAPFILE, ISTAGSITE, LTILTMAJAXIS, LTILTMINAXIS, LTILTPA, MAXSCALE,
DOLOCALMAP,
MAPDIR,
MAPSCALE,
MAPHALFSZ,
REPORT,
MAPSmallerSZ,
MAPFILE,
ISTAGSITE,
LTILTMAJAXIS,
LTILTMINAXIS,
LTILTPA,
MAXSCALE,
// settings for local tag sites. Note TAGSFILE is not optional.
// it contains the tagsite name and lat,lon of tag site tile center
TAGDIR, TAGSCALE, TAGHALFSZ, TAGSFILE, REPORTTAG,
TAGDIR,
TAGSCALE,
TAGHALFSZ,
TAGSFILE,
REPORTTAG,
// pointer to OLA database. only required if DATASOURCE is OLA
OLADBPATH,
@@ -152,12 +195,18 @@ public enum AltPipelnEnum {
* parameters. The pipeline will use default values for these enums if they are not defined in the
* pipeline configuration file.
*/
GALGORITHM, GRAVCONST, GTILTMINAXIS, GTILTPA, MASSUNCERT, VSEARCHRAD_PCTGSD, FSEARCHRAD_PCTGSD, PXPERDEG,
GALGORITHM,
GRAVCONST,
GTILTMINAXIS,
GTILTPA,
MASSUNCERT,
VSEARCHRAD_PCTGSD,
FSEARCHRAD_PCTGSD,
PXPERDEG,
// The following are options to subvert normal pipeline operations or to configure pipeline for
// other missions
// global objs are supplied at all resolutions as the starting point.
// This means we can skip ICQ2PLT, ICQDUMBER, and PLT2OBJ calls
OBJASINPUT,
@@ -191,7 +240,8 @@ public enum AltPipelnEnum {
OBJCOMHEADER,
// identifies whether there are poisson reconstruct data products to include in the webpage report
LOCALPOISSON, GLOBALPOISSON;
LOCALPOISSON,
GLOBALPOISSON;
/*
* The following enumerations are required to exist in the altwg pipeline config file.
@@ -215,8 +265,7 @@ public enum AltPipelnEnum {
public static AltPipelnEnum fromString(String textString) {
for (AltPipelnEnum anciType : AltPipelnEnum.values()) {
if (anciType.toString().equals(textString))
return anciType;
if (anciType.toString().equals(textString)) return anciType;
}
return null;
}
@@ -246,7 +295,6 @@ public enum AltPipelnEnum {
if (parsedVal > 0) {
returnFlag = true;
}
}
return returnFlag;
}
@@ -314,11 +362,9 @@ public enum AltPipelnEnum {
default:
return "NA";
}
}
return "NA";
};
}
;
}

View File

@@ -56,7 +56,6 @@ public class AltwgAnciGlobal extends AnciTableFits implements AnciFitsHeader {
headers.addAll(getSpecificInfo("product specific"));
return headers;
}
/**
@@ -75,8 +74,5 @@ public class AltwgAnciGlobal extends AnciTableFits implements AnciFitsHeader {
headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
return headers;
}
}

View File

@@ -57,7 +57,6 @@ public class AltwgAnciGlobalFacetRelation extends AnciTableFits implements AnciF
headers.addAll(getSpecificInfo("product specific"));
return headers;
}
/**

View File

@@ -56,7 +56,6 @@ public class AltwgAnciLocal extends AnciTableFits implements AnciFitsHeader {
headers.addAll(getSpecificInfo("product specific"));
return headers;
}
/**
@@ -75,8 +74,5 @@ public class AltwgAnciLocal extends AnciTableFits implements AnciFitsHeader {
headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
return headers;
}
}

View File

@@ -63,7 +63,6 @@ public class AltwgGlobalDTM extends DTMFits implements DTMHeader {
headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
return headers;
}
/**
@@ -83,10 +82,8 @@ public class AltwgGlobalDTM extends DTMFits implements DTMHeader {
headers.add(fitsHdr.getHeaderCard(HeaderTag.HDRVERS));
return headers;
}
/**
* Added GSDI - specific to OREX-SPOC.
*/
@@ -105,5 +102,4 @@ public class AltwgGlobalDTM extends DTMFits implements DTMHeader {
return headers;
}
}

View File

@@ -63,7 +63,6 @@ public class AltwgLocalDTM extends DTMFits implements DTMHeader {
headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
return headers;
}
@Override
@@ -129,7 +128,6 @@ public class AltwgLocalDTM extends DTMFits implements DTMHeader {
headers.add(fitsHdr.getHeaderCard(HeaderTag.HDRVERS));
return headers;
}
/**
@@ -150,5 +148,4 @@ public class AltwgLocalDTM extends DTMFits implements DTMHeader {
return headers;
}
}

View File

@@ -23,12 +23,10 @@
package terrasaur.fits;
import java.util.List;
import nom.tam.fits.HeaderCard;
import nom.tam.fits.HeaderCardException;
public interface AnciFitsHeader {
public List<HeaderCard> createFitsHeader() throws HeaderCardException;
}

View File

@@ -60,7 +60,6 @@ public abstract class AnciTableFits {
headers.addAll(getSpatialInfo("summary spatial information"));
return headers;
}
/**
@@ -78,7 +77,6 @@ public abstract class AnciTableFits {
headers.add(fitsHdr.getHeaderCard(HeaderTag.HDRVERS));
return headers;
}
/**
@@ -99,7 +97,6 @@ public abstract class AnciTableFits {
headers.add(fitsHdr.getHeaderCard(HeaderTag.ORIGIN));
return headers;
}
/**
@@ -117,7 +114,6 @@ public abstract class AnciTableFits {
headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
return headers;
}
/**
@@ -141,7 +137,6 @@ public abstract class AnciTableFits {
headers.add(fitsHdr.getHeaderCard(HeaderTag.CREATOR));
headers.add(fitsHdr.getHeaderCard(HeaderTag.OBJ_FILE));
return headers;
}
public List<HeaderCard> getMapInfo(String comment) throws HeaderCardException {
@@ -179,7 +174,6 @@ public abstract class AnciTableFits {
headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFT_VER));
return headers;
}
public List<HeaderCard> getSpatialInfo(String comment) throws HeaderCardException {
@@ -285,7 +279,6 @@ public abstract class AnciTableFits {
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UX_Z));
return headers;
}
public List<HeaderCard> getUY() throws HeaderCardException {
@@ -296,7 +289,6 @@ public abstract class AnciTableFits {
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UY_Z));
return headers;
}
public List<HeaderCard> getUZ() throws HeaderCardException {
@@ -307,7 +299,5 @@ public abstract class AnciTableFits {
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UZ_Z));
return headers;
}
}

View File

@@ -97,7 +97,6 @@ public abstract class DTMFits {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
return headers;
}
/**
@@ -118,7 +117,6 @@ public abstract class DTMFits {
headers.add(fitsHdr.getHeaderCard(HeaderTag.ORIGIN));
return headers;
}
/**
@@ -136,7 +134,6 @@ public abstract class DTMFits {
headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
return headers;
}
/**
@@ -160,7 +157,6 @@ public abstract class DTMFits {
headers.add(fitsHdr.getHeaderCard(HeaderTag.OBJ_FILE));
return headers;
}
public List<HeaderCard> getMapInfo(String comment) throws HeaderCardException {
@@ -197,7 +193,6 @@ public abstract class DTMFits {
headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFT_VER));
return headers;
}
/**
@@ -238,8 +233,7 @@ public abstract class DTMFits {
* @return
* @throws HeaderCardException
*/
public List<HeaderCard> getPlaneInfo(String comment, List<HeaderCard> planeList)
throws HeaderCardException {
public List<HeaderCard> getPlaneInfo(String comment, List<HeaderCard> planeList) throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
if (comment.length() > 0) {
@@ -248,8 +242,8 @@ public abstract class DTMFits {
headers.addAll(planeList);
if (!dataContained) {
String errMesg = "ERROR! Cannot return keywords describing the DTM cube without "
+ "having the actual data!";
String errMesg =
"ERROR! Cannot return keywords describing the DTM cube without " + "having the actual data!";
throw new RuntimeException(errMesg);
}
@@ -260,8 +254,8 @@ public abstract class DTMFits {
for (HeaderCard thisPlane : planeList) {
System.out.println(thisPlane.getKey() + ":" + thisPlane.getValue());
}
String errMesg = "Error: plane List has " + planeList.size() + " planes but datacube"
+ " has " + fitsData.getData().length + " planes";
String errMesg = "Error: plane List has " + planeList.size() + " planes but datacube" + " has "
+ fitsData.getData().length + " planes";
throw new RuntimeException(errMesg);
}
@@ -341,7 +335,6 @@ public abstract class DTMFits {
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UX_Z));
return headers;
}
public List<HeaderCard> getUY() throws HeaderCardException {
@@ -352,7 +345,6 @@ public abstract class DTMFits {
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UY_Z));
return headers;
}
public List<HeaderCard> getUZ() throws HeaderCardException {
@@ -363,11 +355,9 @@ public abstract class DTMFits {
headers.add(fitsHdr.getHeaderCardD(HeaderTag.UZ_Z));
return headers;
}
public HeaderCard getEnd() throws HeaderCardException {
return new HeaderCard(HeaderTag.END.toString(), HeaderTag.END.value(), HeaderTag.END.comment());
}
}

View File

@@ -124,7 +124,6 @@ public class FitsData {
}
}
public static class FitsDataBuilder {
private final double[][][] data;
private double[] V = null;
@@ -189,7 +188,6 @@ public class FitsData {
default:
throw new RuntimeException();
}
return this;
}

View File

@@ -33,12 +33,12 @@ import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.TimeZone;
import org.apache.commons.io.FileUtils;
import nom.tam.fits.FitsException;
import nom.tam.fits.Header;
import nom.tam.fits.HeaderCard;
import nom.tam.fits.HeaderCardException;
import nom.tam.util.Cursor;
import org.apache.commons.io.FileUtils;
import picante.math.coords.CoordConverters;
import picante.math.coords.LatitudinalVector;
import picante.math.vectorspace.UnwritableVectorIJK;
@@ -83,9 +83,7 @@ public class FitsHdr {
* basic no-arg constructor. Note that this constructor does NOT populate the fitsKV map! If you
* want an initial builder with initialized map then call initHdrBuilder().
*/
public FitsHdrBuilder() {
}
public FitsHdrBuilder() {}
/**
* Return all HeaderCards in their current state
@@ -164,7 +162,6 @@ public class FitsHdr {
setVCbyHeaderTag(HeaderTag.LRCLAT, "N/A", HeaderTag.LRCLAT.comment());
setVCbyHeaderTag(HeaderTag.ULCLNG, "N/A", HeaderTag.ULCLNG.comment());
setVCbyHeaderTag(HeaderTag.ULCLAT, "N/A", HeaderTag.ULCLAT.comment());
}
return this;
@@ -179,10 +176,8 @@ public class FitsHdr {
*/
public FitsHdrBuilder setCenterLatLon(LatitudinalVector llr) throws HeaderCardException {
if (llr != null) {
setVCbyHeaderTag(HeaderTag.CLON, Math.toDegrees(llr.getLongitude()),
HeaderTag.CLON.comment());
setVCbyHeaderTag(HeaderTag.CLAT, Math.toDegrees(llr.getLatitude()),
HeaderTag.CLAT.comment());
setVCbyHeaderTag(HeaderTag.CLON, Math.toDegrees(llr.getLongitude()), HeaderTag.CLON.comment());
setVCbyHeaderTag(HeaderTag.CLAT, Math.toDegrees(llr.getLatitude()), HeaderTag.CLAT.comment());
}
return this;
}
@@ -247,7 +242,6 @@ public class FitsHdr {
String errMesg = "ERROR! Center Vector is null! It could be that center"
+ " vector information is already contained in FitsHdrBuilder!";
throw new RuntimeException(errMesg);
}
return this;
}
@@ -289,7 +283,6 @@ public class FitsHdr {
String errMesg = "ERROR! UX Vector is null! It could be that UX"
+ " vector information is already contained in FitsHdrBuilder!";
throw new RuntimeException(errMesg);
}
return this;
@@ -313,11 +306,9 @@ public class FitsHdr {
String errMesg = "ERROR! UY Vector is null! It could be that UY"
+ " vector information is already contained in FitsHdrBuilder!";
throw new RuntimeException(errMesg);
}
return this;
}
/**
@@ -338,11 +329,9 @@ public class FitsHdr {
String errMesg = "ERROR! UZ Vector is null! It could be that UZ"
+ " vector information is already contained in FitsHdrBuilder!";
throw new RuntimeException(errMesg);
}
return this;
}
/**
@@ -382,8 +371,7 @@ public class FitsHdr {
* @return
* @throws HeaderCardException
*/
public FitsHdrBuilder setVbyHeaderTag(HeaderTag hdrTag, String value)
throws HeaderCardException {
public FitsHdrBuilder setVbyHeaderTag(HeaderTag hdrTag, String value) throws HeaderCardException {
if (fitsKV.containsKey(hdrTag.toString())) {
fitsKV.get(hdrTag.toString()).setValue(value);
@@ -409,8 +397,7 @@ public class FitsHdr {
* @return
* @throws HeaderCardException
*/
public FitsHdrBuilder setVbyHeaderTag(HeaderTag hdrTag, double value)
throws HeaderCardException {
public FitsHdrBuilder setVbyHeaderTag(HeaderTag hdrTag, double value) throws HeaderCardException {
String hdrKey = hdrTag.toString();
if (fitsKV.containsKey(hdrKey)) {
@@ -453,10 +440,8 @@ public class FitsHdr {
}
HeaderCard newCard = new HeaderCard(hdrKey, value, comment);
fitsKV.put(hdrKey, newCard);
}
return this;
}
/**
@@ -477,7 +462,6 @@ public class FitsHdr {
throw new RuntimeException(errMesg);
}
return this;
}
/**
@@ -507,7 +491,6 @@ public class FitsHdr {
fitsKV.put(hdrKey, newCard);
}
return this;
}
/**
@@ -525,7 +508,6 @@ public class FitsHdr {
System.out.println(
"WARNING! header:" + hdrKey + " is new. Will add it to " + "fits header builder.");
}
}
fitsKV.put(hdrKey, headerCard);
@@ -582,9 +564,9 @@ public class FitsHdr {
if (fitsData.hasUnitv()) {
// assume all unit vectors are present
this.setUx(fitsData.getUnit(UnitDir.UX)).setUy(fitsData.getUnit(UnitDir.UY))
this.setUx(fitsData.getUnit(UnitDir.UX))
.setUy(fitsData.getUnit(UnitDir.UY))
.setUz(fitsData.getUnit(UnitDir.UZ));
}
if (fitsData.hasGsd()) {
@@ -638,15 +620,13 @@ public class FitsHdr {
public void addCard(HeaderTag headerTag) throws HeaderCardException {
if (this.containsKey(headerTag)) {
System.out.println("WARNING: keyword:" + headerTag.toString()
+ " already exists in map. Will not" + " add a default headercard!");
System.out.println("WARNING: keyword:" + headerTag.toString() + " already exists in map. Will not"
+ " add a default headercard!");
} else {
HeaderCard newCard =
new HeaderCard(headerTag.toString(), headerTag.value(), headerTag.comment());
HeaderCard newCard = new HeaderCard(headerTag.toString(), headerTag.value(), headerTag.comment());
fitsKV.put(headerTag.toString(), newCard);
}
}
public FitsHdr build() {
@@ -661,8 +641,7 @@ public class FitsHdr {
* @return
* @throws HeaderCardException
*/
public static FitsHdrBuilder checkGlobalLatLon(FitsHdrBuilder thisBuilder)
throws HeaderCardException {
public static FitsHdrBuilder checkGlobalLatLon(FitsHdrBuilder thisBuilder) throws HeaderCardException {
// check if this keyword exists first. If not, then add all the corner keywords!
if (!thisBuilder.containsKey(HeaderTag.LLCLNG)) {
@@ -687,17 +666,14 @@ public class FitsHdr {
thisBuilder.setVbyHeaderTag(HeaderTag.URCLAT, 90D);
thisBuilder.setVbyHeaderTag(HeaderTag.LRCLNG, 180D);
thisBuilder.setVbyHeaderTag(HeaderTag.LRCLAT, -90D);
}
// check if this keyword exists first. If not then add the center lat,lon keywords!
if (!thisBuilder.containsKey(HeaderTag.CLAT)) {
thisBuilder.addCard(HeaderTag.CLAT);
thisBuilder.addCard(HeaderTag.CLON);
}
if ((!validLatLon(thisBuilder, HeaderTag.CLAT))
|| (!validLatLon(thisBuilder, HeaderTag.CLON))) {
if ((!validLatLon(thisBuilder, HeaderTag.CLAT)) || (!validLatLon(thisBuilder, HeaderTag.CLON))) {
System.out.println("WARNING! check of global center keywords shows they "
+ "have not been set! Setting to default global values (0,0)");
LatitudinalVector llr = new LatitudinalVector(1., 0., 0.);
@@ -739,7 +715,6 @@ public class FitsHdr {
corner = HeaderTag.CLON;
throwBadLatLon(thisBuilder, corner);
}
private static boolean validLatLon(FitsHdrBuilder thisBuilder, HeaderTag cornerTag) {
@@ -755,13 +730,10 @@ public class FitsHdr {
if (!validLatLon(thisBuilder, cornerTag)) {
String chkVal = thisBuilder.getCard(cornerTag).getValue();
String errMesg =
("ERROR!" + cornerTag.toString() + " in FitsHdrBuilder not valid:" + chkVal);
String errMesg = ("ERROR!" + cornerTag.toString() + " in FitsHdrBuilder not valid:" + chkVal);
throw new RuntimeException(errMesg);
}
}
} // end static class FitsHdrBuilder
/**
@@ -787,7 +759,6 @@ public class FitsHdr {
+ fitsFile.toString() + " for fits header!";
System.err.println(errmesg);
System.exit(1);
}
return hdrBuilder;
}
@@ -803,8 +774,7 @@ public class FitsHdr {
* @return
* @throws HeaderCardException
*/
public static FitsHdrBuilder copyFitsHeader(File fitsFile, boolean initHdr)
throws HeaderCardException {
public static FitsHdrBuilder copyFitsHeader(File fitsFile, boolean initHdr) throws HeaderCardException {
FitsHdrBuilder hdrBuilder = new FitsHdrBuilder();
try {
@@ -817,12 +787,10 @@ public class FitsHdr {
+ fitsFile.toString() + " for fits header!";
System.err.println(errmesg);
System.exit(1);
}
return hdrBuilder;
}
/**
* Loops through map of fits header cards and use it to populate and return FitsHdrBuilder. There
* are NO checks to see if the keywords match those in the toolkit!
@@ -857,8 +825,7 @@ public class FitsHdr {
* @return
* @throws HeaderCardException
*/
public static FitsHdrBuilder copyFitsHeader(Map<String, HeaderCard> map)
throws HeaderCardException {
public static FitsHdrBuilder copyFitsHeader(Map<String, HeaderCard> map) throws HeaderCardException {
// initialize hdrBuilder with the full set of keywords specified in HeaderTag
boolean initHdr = false;
@@ -876,8 +843,8 @@ public class FitsHdr {
* @return
* @throws HeaderCardException
*/
public static FitsHdrBuilder addToFitsHeader(Map<String, HeaderCard> map,
FitsHdrBuilder hdrBuilder) throws HeaderCardException {
public static FitsHdrBuilder addToFitsHeader(Map<String, HeaderCard> map, FitsHdrBuilder hdrBuilder)
throws HeaderCardException {
if (hdrBuilder == null) {
hdrBuilder = new FitsHdrBuilder();
@@ -888,7 +855,6 @@ public class FitsHdr {
hdrBuilder.setbyHeaderCard(thisCard);
}
return hdrBuilder;
}
/**
@@ -899,8 +865,7 @@ public class FitsHdr {
* @return
* @throws HeaderCardException
*/
public static FitsHdrBuilder addToFitsHeader(File fitsFile, FitsHdrBuilder hdrBuilder)
throws HeaderCardException {
public static FitsHdrBuilder addToFitsHeader(File fitsFile, FitsHdrBuilder hdrBuilder) throws HeaderCardException {
if (hdrBuilder == null) {
hdrBuilder = new FitsHdrBuilder();
@@ -917,7 +882,6 @@ public class FitsHdr {
+ fitsFile.toString() + " for fits header!";
System.err.println(errmesg);
System.exit(1);
}
return hdrBuilder;
}
@@ -931,8 +895,7 @@ public class FitsHdr {
public static FitsHdrBuilder initHdrBuilder() throws HeaderCardException {
FitsHdrBuilder builder = new FitsHdrBuilder();
for (HeaderTag tag : HeaderTag.fitsKeywords) {
builder.fitsKV.put(tag.toString(),
new HeaderCard(tag.toString(), tag.value(), tag.comment()));
builder.fitsKV.put(tag.toString(), new HeaderCard(tag.toString(), tag.value(), tag.comment()));
}
return builder;
}
@@ -957,8 +920,8 @@ public class FitsHdr {
}
if (hdrBuilder == null) {
System.out.println("builder passed to FitsHeader.configHdrBuilder() is null. Generating"
+ " new FitsHeaderBuilder");
System.out.println(
"builder passed to FitsHeader.configHdrBuilder() is null. Generating" + " new FitsHeaderBuilder");
hdrBuilder = new FitsHdrBuilder();
}
List<String> content = FileUtils.readLines(new File(configFile), Charset.defaultCharset());
@@ -987,8 +950,8 @@ public class FitsHdr {
// user explicitly wants to override any comment in this header with null
hdrBuilder.setVCbyHeaderTag(thisTag, keyval[1], null);
} else {
System.out.println("setting " + thisTag.toString() + " to " + keyval[1]
+ ", comment to " + keyval[2]);
System.out.println(
"setting " + thisTag.toString() + " to " + keyval[1] + ", comment to " + keyval[2]);
hdrBuilder.setVCbyHeaderTag(thisTag, keyval[1], keyval[2]);
}
} else {
@@ -997,9 +960,7 @@ public class FitsHdr {
System.out.println(line);
System.out.println("Cannot parse. skipping this line");
}
}
}
}
if (!separatorFound) {
@@ -1054,8 +1015,8 @@ public class FitsHdr {
return fitsKV.get(keyword);
} else {
System.out.println("WARNING! keyword:" + keyword + " not set in fits header builder."
+ " using default value.");
System.out.println(
"WARNING! keyword:" + keyword + " not set in fits header builder." + " using default value.");
return new HeaderCard(tag.toString(), tag.value(), tag.comment());
}
}
@@ -1075,18 +1036,16 @@ public class FitsHdr {
String keyword = tag.toString();
if (fitsKV.containsKey(keyword)) {
HeaderCard thisCard = fitsKV.get(keyword);
return new HeaderCard(thisCard.getKey(),
return new HeaderCard(
thisCard.getKey(),
Double.parseDouble(thisCard.getValue().replace('d', 'e').replace('D', 'e')),
thisCard.getComment());
} else {
System.out.println(
"WARNING! keyword:" + keyword + " not set in FitsHdr," + " using default value.");
System.out.println("WARNING! keyword:" + keyword + " not set in FitsHdr," + " using default value.");
return new HeaderCard(keyword, tag.value(), tag.comment());
}
}
/**
@@ -1101,15 +1060,12 @@ public class FitsHdr {
if (fitsKV.containsKey(keyword)) {
HeaderCard thisCard = fitsKV.get(keyword);
double value = Double.parseDouble(thisCard.getValue());
if (Double.isFinite(value))
value = Double.parseDouble(String.format(fmtS, value));
if (Double.isFinite(value)) value = Double.parseDouble(String.format(fmtS, value));
return new HeaderCard(thisCard.getKey(), value, thisCard.getComment());
} else {
String mesg =
"WARNING! keyword:" + tag.toString() + " not set in FitsHdr," + " using default value.";
String mesg = "WARNING! keyword:" + tag.toString() + " not set in FitsHdr," + " using default value.";
System.out.println(mesg);
return new HeaderCard(keyword, tag.value(), tag.comment());
}
}
@@ -1136,7 +1092,5 @@ public class FitsHdr {
}
return matchCard;
}
}

View File

@@ -28,10 +28,10 @@ import java.nio.charset.Charset;
import java.util.EnumMap;
import java.util.List;
import java.util.Map;
import org.apache.commons.io.FileUtils;
import nom.tam.fits.FitsException;
import nom.tam.fits.HeaderCard;
import nom.tam.fits.HeaderCardException;
import org.apache.commons.io.FileUtils;
import terrasaur.utils.StringUtil;
/**
@@ -211,55 +211,33 @@ public class FitsHeader {
private FitsValCom nAxis2 = new FitsValCom("1024", null);
private FitsValCom nAxis3 = new FitsValCom("numberPlanesNotSet", null);
private FitsValCom hdrVers =
new FitsValCom(HeaderTag.HDRVERS.value(), HeaderTag.HDRVERS.comment());
private FitsValCom mission =
new FitsValCom(HeaderTag.MISSION.value(), HeaderTag.MISSION.comment());
private FitsValCom hdrVers = new FitsValCom(HeaderTag.HDRVERS.value(), HeaderTag.HDRVERS.comment());
private FitsValCom mission = new FitsValCom(HeaderTag.MISSION.value(), HeaderTag.MISSION.comment());
private FitsValCom hostName = new FitsValCom(HeaderTag.HOSTNAME.value(), null);
private FitsValCom target = new FitsValCom(HeaderTag.TARGET.value(), null);
private FitsValCom origin =
new FitsValCom(HeaderTag.ORIGIN.value(), HeaderTag.ORIGIN.comment());
private FitsValCom spocid =
new FitsValCom(HeaderTag.SPOC_ID.value(), HeaderTag.SPOC_ID.comment());
private FitsValCom sdparea =
new FitsValCom(HeaderTag.SDPAREA.value(), HeaderTag.SDPAREA.comment());
private FitsValCom sdpdesc =
new FitsValCom(HeaderTag.SDPDESC.value(), HeaderTag.SDPDESC.comment());
private FitsValCom missionPhase =
new FitsValCom(HeaderTag.MPHASE.value(), HeaderTag.MPHASE.comment());
private FitsValCom dataSource =
new FitsValCom(HeaderTag.DATASRC.value(), HeaderTag.DATASRC.comment());
private FitsValCom dataSourceFile =
new FitsValCom(HeaderTag.DATASRCF.value(), HeaderTag.DATASRCF.comment());
private FitsValCom dataSourceS =
new FitsValCom(HeaderTag.DATASRCS.value(), HeaderTag.DATASRCS.comment());
private FitsValCom dataSourceV =
new FitsValCom(HeaderTag.DATASRCV.value(), HeaderTag.DATASRCV.comment());
private FitsValCom software =
new FitsValCom(HeaderTag.SOFTWARE.value(), HeaderTag.SOFTWARE.comment());
private FitsValCom softver =
new FitsValCom(HeaderTag.SOFT_VER.value(), HeaderTag.SOFT_VER.comment());
private FitsValCom datasrcd =
new FitsValCom(HeaderTag.DATASRCD.value(), HeaderTag.DATASRCD.comment());
private FitsValCom productName =
new FitsValCom(HeaderTag.PRODNAME.value(), HeaderTag.PRODNAME.comment());
private FitsValCom dateprd =
new FitsValCom(HeaderTag.DATEPRD.value(), HeaderTag.DATEPRD.comment());
private FitsValCom origin = new FitsValCom(HeaderTag.ORIGIN.value(), HeaderTag.ORIGIN.comment());
private FitsValCom spocid = new FitsValCom(HeaderTag.SPOC_ID.value(), HeaderTag.SPOC_ID.comment());
private FitsValCom sdparea = new FitsValCom(HeaderTag.SDPAREA.value(), HeaderTag.SDPAREA.comment());
private FitsValCom sdpdesc = new FitsValCom(HeaderTag.SDPDESC.value(), HeaderTag.SDPDESC.comment());
private FitsValCom missionPhase = new FitsValCom(HeaderTag.MPHASE.value(), HeaderTag.MPHASE.comment());
private FitsValCom dataSource = new FitsValCom(HeaderTag.DATASRC.value(), HeaderTag.DATASRC.comment());
private FitsValCom dataSourceFile = new FitsValCom(HeaderTag.DATASRCF.value(), HeaderTag.DATASRCF.comment());
private FitsValCom dataSourceS = new FitsValCom(HeaderTag.DATASRCS.value(), HeaderTag.DATASRCS.comment());
private FitsValCom dataSourceV = new FitsValCom(HeaderTag.DATASRCV.value(), HeaderTag.DATASRCV.comment());
private FitsValCom software = new FitsValCom(HeaderTag.SOFTWARE.value(), HeaderTag.SOFTWARE.comment());
private FitsValCom softver = new FitsValCom(HeaderTag.SOFT_VER.value(), HeaderTag.SOFT_VER.comment());
private FitsValCom datasrcd = new FitsValCom(HeaderTag.DATASRCD.value(), HeaderTag.DATASRCD.comment());
private FitsValCom productName = new FitsValCom(HeaderTag.PRODNAME.value(), HeaderTag.PRODNAME.comment());
private FitsValCom dateprd = new FitsValCom(HeaderTag.DATEPRD.value(), HeaderTag.DATEPRD.comment());
private FitsValCom productType = new FitsValCom("productTypeNotSet", null);
private FitsValCom productVer = new FitsValCom(HeaderTag.PRODVERS.value(), HeaderTag.PRODVERS.comment());
private FitsValCom mapVer = new FitsValCom(HeaderTag.MAP_VER.value(), HeaderTag.MAP_VER.comment());
private FitsValCom objFile = new FitsValCom(HeaderTag.OBJ_FILE.value(), HeaderTag.OBJ_FILE.comment());
private FitsValCom author = new FitsValCom(HeaderTag.CREATOR.value(), HeaderTag.CREATOR.comment());
private FitsValCom mapName = new FitsValCom(HeaderTag.MAP_NAME.value(), HeaderTag.MAP_NAME.comment());
private FitsValCom mapType = new FitsValCom(HeaderTag.MAP_TYPE.value(), HeaderTag.MAP_TYPE.comment());
private FitsValCom clon = new FitsValCom("-999", HeaderTag.CLON.comment());
private FitsValCom clat = new FitsValCom("-999", HeaderTag.CLAT.comment());
private FitsValCom minlon = new FitsValCom("-999", HeaderTag.MINLON.comment());
@@ -293,8 +271,7 @@ public class FitsHeader {
private FitsValCom sigDef = new FitsValCom(HeaderTag.SIGMA.value(), HeaderTag.SIGMA.comment());
private FitsValCom dqual1 = new FitsValCom("-999", HeaderTag.DQUAL_1.comment());
private FitsValCom dqual2 = new FitsValCom("-999", HeaderTag.DQUAL_2.comment());
private FitsValCom dsigDef = new FitsValCom(HeaderTag.DSIG_DEF.value(), HeaderTag.DSIG_DEF.comment());
private FitsValCom density = new FitsValCom("-999", HeaderTag.DENSITY.comment());
private FitsValCom rotRate = new FitsValCom("-999", HeaderTag.ROT_RATE.comment());
private FitsValCom refPot = new FitsValCom("-999", HeaderTag.REF_POT.comment());
@@ -303,8 +280,7 @@ public class FitsHeader {
private FitsValCom tiltMin = new FitsValCom("-999", HeaderTag.TILT_MIN.comment());
private FitsValCom tiltPa = new FitsValCom("0", HeaderTag.TILT_PA.comment());
private EnumMap<HeaderTag, FitsValCom> tag2valcom = new EnumMap<HeaderTag, FitsValCom>(HeaderTag.class);
public FitsHeaderBuilder() {
@@ -380,7 +356,6 @@ public class FitsHeader {
tag2valcom.put(HeaderTag.MAP_NAME, mapName);
tag2valcom.put(HeaderTag.MAP_TYPE, mapType);
tag2valcom.put(HeaderTag.MAP_VER, mapVer);
}
public FitsHeaderBuilder setTarget(String val, String comment) {
@@ -448,7 +423,8 @@ public class FitsHeader {
hdrTag = HeaderTag.valueOf(headerCard.getKey());
setVCbyHeaderTag(hdrTag, headerCard.getValue(), headerCard.getComment());
} catch (IllegalArgumentException e) {
if ((headerCard.getKey().contains("COMMENT")) || (headerCard.getKey().contains("PLANE"))) {
} else {
System.out.println(headerCard.getKey() + " not a HeaderTag");
}
@@ -457,14 +433,12 @@ public class FitsHeader {
}
return this;
}
public FitsHeader build() {
return new FitsHeader(this);
}
}
/**
@@ -506,7 +480,6 @@ public class FitsHeader {
+ fitsFile.toString() + " for fits header!";
System.err.println(errmesg);
System.exit(1);
}
return hdrBuilder;
}
@@ -533,8 +506,8 @@ public class FitsHeader {
}
if (hdrBuilder == null) {
System.out.println(
"builder passed to FitsHeader.configHdrBuilder() is null. Generating" + " new FitsHeaderBuilder");
hdrBuilder = new FitsHeaderBuilder();
}
List<String> content = FileUtils.readLines(new File(configFile), Charset.defaultCharset());
@@ -560,8 +533,8 @@ public class FitsHeader {
// user explicitly wants to override any comment in this header with null
hdrBuilder.setVCbyHeaderTag(thisTag, keyval[1], null);
} else {
System.out.println(
"setting " + thisTag.toString() + " to " + keyval[1] + ", comment to " + keyval[2]);
hdrBuilder.setVCbyHeaderTag(thisTag, keyval[1], keyval[2]);
}
} else {
@@ -570,7 +543,6 @@ public class FitsHeader {
System.out.println(line);
System.out.println("Cannot parse. skipping this line");
}
}
}
}
@@ -609,7 +581,6 @@ public class FitsHeader {
int returnType = 0;
switch (tag) {
case PXPERDEG:
case CLON:
case CLAT:
@@ -665,15 +636,11 @@ public class FitsHeader {
return new HeaderCard(tag.toString(), valcom.getV(), valcom.getC());
case 1:
return new HeaderCard(tag.toString(), StringUtil.str2fmtD(fmtS, valcom.getV()), valcom.getC());
case 2:
return new HeaderCard(tag.toString(), StringUtil.parseSafeD(valcom.getV()), valcom.getC());
}
}
String errMesg = "ERROR!, cannot find fits keyword:" + tag.toString();
@@ -702,5 +669,4 @@ public class FitsHeader {
}
return hdrBuilder;
}
}
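The FitsHeaderBuilder hunks above revolve around one data structure: an EnumMap that ties each header tag to a mutable (value, comment) holder, with chainable setters. A minimal self-contained sketch of that pattern follows; the class, enum, and method names here are illustrative stand-ins, not the actual Terrasaur API.

```java
import java.util.EnumMap;

// Illustrative sketch of the FitsHeaderBuilder pattern: an EnumMap keyed by
// header tag holds a mutable (value, comment) pair per keyword, and setters
// return this so calls can be chained like the real builder.
class HeaderBuilderSketch {
    enum Tag { MISSION, TARGET }

    static final class ValCom {
        String value;
        String comment;
        ValCom(String value, String comment) {
            this.value = value;
            this.comment = comment;
        }
    }

    private final EnumMap<Tag, ValCom> tag2valcom = new EnumMap<>(Tag.class);

    HeaderBuilderSketch() {
        // defaults, analogous to the built-in HeaderTag values
        tag2valcom.put(Tag.MISSION, new ValCom("OSIRIS-REx", "Name of mission"));
        tag2valcom.put(Tag.TARGET, new ValCom("101955 BENNU", "Target object"));
    }

    HeaderBuilderSketch set(Tag tag, String value, String comment) {
        ValCom vc = tag2valcom.get(tag);
        vc.value = value;
        vc.comment = comment;
        return this;
    }

    String value(Tag tag) {
        return tag2valcom.get(tag).value;
    }
}
```

The real builder pre-populates the map with a default for every HeaderTag in its constructor, much as the two put() calls do here.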
@@ -42,11 +42,9 @@ public class FitsHeaderFactory {
private static final String PLANE = "PLANE";
private static final String COMMENT = "COMMENT";
public static DTMHeader getDTMHeader(FitsHdr fitsHdr, FitsHeaderType headerType) {
switch (headerType) {
case NFTMLN:
return new NFTmln(fitsHdr);
@@ -65,7 +63,6 @@ public class FitsHeaderFactory {
default:
return null;
}
}
public static AnciFitsHeader getAnciHeader(FitsHdr fitsHdr, FitsHeaderType headerType) {
@@ -89,10 +86,8 @@ public class FitsHeaderFactory {
default:
return null;
}
}
/**
* Fits Header block that contains information about the fits header itself. Ex. Header version
* number.
@@ -108,7 +103,6 @@ public class FitsHeaderFactory {
headers.add(fitsHdr.getHeaderCard(HeaderTag.HDRVERS));
return headers;
}
/**
@@ -129,7 +123,6 @@ public class FitsHeaderFactory {
headers.add(fitsHdr.getHeaderCard(HeaderTag.ORIGIN));
return headers;
}
/**
@@ -151,7 +144,6 @@ public class FitsHeaderFactory {
headers.add(fitsHdr.getHeaderCard(HeaderTag.MPHASE));
return headers;
}
/**
@@ -184,7 +176,6 @@ public class FitsHeaderFactory {
headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFT_VER));
return headers;
}
public static List<HeaderCard> getMapSpecificInfo(FitsHeader fitsHdr) throws HeaderCardException {
@@ -203,8 +194,8 @@ public class FitsHeaderFactory {
// GSDI*
}
public static List<HeaderCard> getSummarySpatialInfo(FitsHeader fitsHdr, FitsHeaderType fitsHeaderType)
throws HeaderCardException {
List<HeaderCard> headers = new ArrayList<HeaderCard>();
@@ -230,6 +221,5 @@ public class FitsHeaderFactory {
break;
}
return headers;
}
}
@@ -28,7 +28,6 @@ import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import nom.tam.fits.BasicHDU;
import nom.tam.fits.Fits;
import nom.tam.fits.FitsException;
@@ -37,6 +36,7 @@ import nom.tam.fits.HeaderCard;
import nom.tam.fits.TableHDU;
import nom.tam.util.BufferedFile;
import nom.tam.util.Cursor;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
/**
* Utility class containing generic static routines for working with FITS files. Refer to AltwgFits
@@ -50,8 +50,7 @@ public class FitsUtil {
// private static final String PLANE = "PLANE";
// public static final String COMMENT = "COMMENT";
public static double[][][] loadFits(String filename, int[] axes) throws FitsException, IOException {
Fits f = new Fits(filename);
BasicHDU hdu = f.getHDU(0);
int[] axes2 = hdu.getAxes();
@@ -87,7 +86,6 @@ public class FitsUtil {
int[] axes = hdu.getAxes();
f.close();
return axes;
}
/**
@@ -107,8 +105,7 @@ public class FitsUtil {
BasicHDU hdu = f.getHDU(0);
axes = hdu.getAxes();
if (axes.length != 3) {
throw new IOException("FITS file has incorrect dimensions! This was assumed to be 3D fits file!");
}
// fits axes specifies image dimensions in this order: [numPlanes][iSize][jSize]
@@ -162,8 +159,7 @@ public class FitsUtil {
for (int ii = 0; ii < fitsData[1].length; ii++) {
for (int jj = 0; jj < fitsData[0][1].length; jj++) {
Vector3D fits1P = new Vector3D(fitsData[4][ii][jj], fitsData[5][ii][jj], fitsData[6][ii][jj]);
// find angular separation between fits1 P vector and fits2 P vector
double radSep = Vector3D.angle(fits1P, xyzV);
@@ -194,7 +190,6 @@ public class FitsUtil {
public static boolean isGeneralHeaderKey(HeaderTag keyword) {
String name = keyword.name();
switch (name) {
case "SIMPLE":
case "BITPIX":
case "NAXIS":
@@ -229,10 +224,8 @@ public class FitsUtil {
// Add any tags passed in as the third argument of this function
if (newHeaderCards != null) {
for (HeaderCard hc : newHeaderCards) {
if (hc.getKey().toUpperCase().equals("COMMENT")) hdu.getHeader().insertComment(hc.getComment());
else hdu.getHeader().addLine(hc);
}
}
@@ -264,10 +257,8 @@ public class FitsUtil {
// Add any tags passed in as the third argument of this function
if (newHeaderCards != null) {
for (HeaderCard hc : newHeaderCards) {
if (hc.getKey().toUpperCase().equals("COMMENT")) hdu.getHeader().insertComment(hc.getComment());
else hdu.getHeader().addLine(hc);
}
}
@@ -279,7 +270,6 @@ public class FitsUtil {
f.close();
}
/**
* Assuming data consists of multiple 2D planes with the following ordering: Note that this
* particular ordering is created so that the data is written properly to the fits file.
@@ -359,8 +349,9 @@ public class FitsUtil {
double lon = indata[1][i][j];
double rad = indata[2][i][j];
double[] pt = new Vector3D(Math.toRadians(lon), Math.toRadians(lat))
.scalarMultiply(rad)
.toArray();
outdata[0][i][j] = indata[0][i][j];
outdata[1][i][j] = indata[1][i][j];
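The hunk above converts per-pixel (lat, lon, radius) planes to Cartesian points via commons-math's two-argument Vector3D(alpha, delta) constructor, which builds a unit vector from an azimuth and an elevation before scaling by the radius. Written out by hand, with no library dependency, the conversion is:

```java
// Sketch of the (lat, lon, radius) -> (x, y, z) step: lon acts as the
// azimuth about +Z and lat as the elevation above the XY plane, matching
// new Vector3D(lonRad, latRad).scalarMultiply(rad).
class SphericalSketch {
    static double[] toCartesian(double latDeg, double lonDeg, double radius) {
        double lat = Math.toRadians(latDeg);
        double lon = Math.toRadians(lonDeg);
        return new double[] {
            radius * Math.cos(lat) * Math.cos(lon),
            radius * Math.cos(lat) * Math.sin(lon),
            radius * Math.sin(lat)
        };
    }
}
```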
@@ -413,8 +404,7 @@ public class FitsUtil {
int numPlanes = indata.length;
int numRows = indata[0].length;
int numCols = indata[0][0].length;
double[][][] outdata = new double[numPlanes][numRows - 2 * cropAmount][numCols - 2 * cropAmount];
for (int k = 0; k < numPlanes; ++k)
for (int i = 0; i < numRows - 2 * cropAmount; ++i)
for (int j = 0; j < numCols - 2 * cropAmount; ++j) {
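The crop loop above trims cropAmount cells from every edge of each plane in a [planes][rows][cols] cube. The diff truncates the loop body, so the offset copy in this sketch is an assumed reconstruction of what it does:

```java
// Border crop of a 3D cube: drop cropAmount cells from each edge of every
// 2D plane. The inner copy (indices shifted by cropAmount) is an assumption
// about the loop body elided by the diff.
class CropSketch {
    static double[][][] crop(double[][][] in, int cropAmount) {
        int planes = in.length;
        int rows = in[0].length;
        int cols = in[0][0].length;
        double[][][] out = new double[planes][rows - 2 * cropAmount][cols - 2 * cropAmount];
        for (int k = 0; k < planes; ++k)
            for (int i = 0; i < rows - 2 * cropAmount; ++i)
                for (int j = 0; j < cols - 2 * cropAmount; ++j)
                    out[k][i][j] = in[k][i + cropAmount][j + cropAmount];
        return out;
    }
}
```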
@@ -460,8 +450,7 @@ public class FitsUtil {
* @param newHeaderCard
* @return
*/
public static List<HeaderCard> updateOrAppendCard(List<HeaderCard> headers, HeaderCard newHeaderCard) {
List<HeaderCard> newHeaders = new ArrayList<HeaderCard>();
@@ -481,10 +470,8 @@ public class FitsUtil {
}
return newHeaders;
}
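updateOrAppendCard replaces a card whose keyword already exists in the header and appends otherwise, preserving order. A minimal sketch of those semantics, with a String[2] {key, value} pair standing in for HeaderCard:

```java
import java.util.ArrayList;
import java.util.List;

// Replace-or-append over an ordered card list: a matching key is replaced
// in place (order preserved); an unmatched key is appended at the end.
class UpdateOrAppendSketch {
    static List<String[]> updateOrAppend(List<String[]> cards, String[] newCard) {
        List<String[]> out = new ArrayList<>();
        boolean replaced = false;
        for (String[] card : cards) {
            if (card[0].equals(newCard[0])) {
                out.add(newCard); // same keyword: take the new value
                replaced = true;
            } else {
                out.add(card);
            }
        }
        if (!replaced) out.add(newCard); // no match: append
        return out;
    }
}
```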
/**
* Return the fits header as List&lt;HeaderCard&gt;. Each HeaderCard is equivalent to the FITS
* Keyword = Value, comment line in a fits header.
@@ -520,8 +507,8 @@ public class FitsUtil {
* @return
* @throws FitsException
*/
public static HeaderCard getCard(Map<String, HeaderCard> map, String searchKey, boolean failOnNull)
throws FitsException {
HeaderCard returnCard = null;
if (map.containsKey(searchKey)) {
returnCard = map.get(searchKey);
@@ -545,8 +532,7 @@ public class FitsUtil {
* @throws FitsException
* @throws IOException
*/
public static Map<String, HeaderCard> getFitsHeaderAsMap(String fitsFile) throws FitsException, IOException {
Fits inf = new Fits(fitsFile);
BasicHDU inHdu = inf.getHDU(0);
@@ -57,5 +57,4 @@ public class FitsValCom {
setV(newVal);
setC(newComment);
}
}
@@ -44,7 +44,6 @@ public class GenericAnciGlobal extends AnciTableFits implements AnciFitsHeader {
super(fitsHeader, FitsHeaderType.ANCIGLOBALGENERIC);
}
/**
* Build fits header portion that contains the spatial information of the Generic Anci Global
* product. Overrides the default implementation in AnciTableFits.
@@ -64,5 +63,4 @@ public class GenericAnciGlobal extends AnciTableFits implements AnciFitsHeader {
return headers;
}
}
@@ -39,6 +39,4 @@ public class GenericAnciLocal extends AnciTableFits implements AnciFitsHeader {
public GenericAnciLocal(FitsHdr fitsHeader) {
super(fitsHeader, FitsHeaderType.ANCILOCALGENERIC);
}
}
@@ -37,5 +37,4 @@ public class GenericGlobalDTM extends DTMFits implements DTMHeader {
public GenericGlobalDTM(FitsHdr fitsHeader) {
super(fitsHeader, FitsHeaderType.DTMGLOBALGENERIC);
}
}
@@ -37,5 +37,4 @@ public class GenericLocalDTM extends DTMFits implements DTMHeader {
public GenericLocalDTM(FitsHdr fitsHeader) {
super(fitsHeader, FitsHeaderType.DTMLOCALGENERIC);
}
}
@@ -41,81 +41,109 @@ public enum HeaderTag {
// that are not null. Some of the default values obviously need to be updated when actual fits
// file
// is created.
SIMPLE("T", null),
BITPIX(null, null),
NAXIS("3", null),
NAXIS1(null, null),
NAXIS2(null, null),
NAXIS3(null, null),
EXTEND("T", null),
HDRVERS("1.0.0", null),
NPRDVERS("1.0.0", null),
MISSION("OSIRIS-REx", "Name of mission"),
HOSTNAME("OREX", "PDS ID"),
TARGET("101955 BENNU", "Target object"),
ORIGIN("OREXSPOC", null),
SPOC_ID("SPOCUPLOAD", null),
SDPAREA("SPOCUPLOAD", null),
SDPDESC("SPOCUPLOAD", null),
MPHASE("FillMeIn", "Mission phase."),
DATASRC("FillMeIn", "Shape model data source, i.e. 'SPC' or 'OLA'"),
DATASRCF("FillMeIn", "Source shape model data filename"),
DATASRCV("FillMeIn", "Name and version of shape model"),
DATASRCD("FillMeIn", "Creation date of shape model in UTC"),
DATASRCS("N/A", "[m/pix] Shape model plt scale"),
SOFTWARE("FillMeIn", "Software used to create map data"),
SOFT_VER("FillMeIn", "Version of software used to create map data"),
DATEPRD("1701-10-09", "Date this product was produced in UTC"),
DATENPRD("1701-10-09", "Date this NFT product was produced in UTC"),
PRODNAME("FillMeIn", "Product filename"),
PRODVERS("1.0.0", "Product version number"),
MAP_VER("999", "Product version number."),
CREATOR("ALT-pipeline", "Name of software that created this product"),
AUTHOR("Espiritu", "Name of person that compiled this product"),
PROJECTN("Equirectangular", "Simple cylindrical projection"),
CLON("-999", "[deg] longitude at center of image"),
CLAT("-999", "[deg] latitude at center of image"),
MINLON(null, "[deg] minimum longitude of global DTM"),
MAXLON(null, "[deg] maximum longitude of global DTM"),
MINLAT(null, "[deg] minimum latitude of global DTM"),
MAXLAT(null, "[deg] maximum latitude of global DTM"),
PXPERDEG("-999", "[pixel per degree] grid spacing of global map."),
LLCLAT("-999", "[deg]"),
LLCLNG("-999", "[deg]"),
ULCLAT("-999", "[deg]"),
ULCLNG("-999", "[deg]"),
URCLAT("-999", "[deg]"),
URCLNG("-999", "[deg]"),
LRCLAT("-999", "[deg]"),
LRCLNG("-999", "[deg]"),
CNTR_V_X("-999", "[km]"),
CNTR_V_Y("-999", "[km]"),
CNTR_V_Z("-999", "[km]"),
UX_X("-999", "[m]"),
UX_Y("-999", "[m]"),
UX_Z("-999", "[m]"),
UY_X("-999", "[m]"),
UY_Y("-999", "[m]"),
UY_Z("-999", "[m]"),
UZ_X("-999", "[m]"),
UZ_Y("-999", "[m]"),
UZ_Z("-999", "[m]"),
GSD("-999", "[mm] grid spacing in units/pixel"),
GSDI("-999", "[unk] Ground sample distance integer"),
SIGMA("-999", "Global uncertainty of the data [m]"),
SIG_DEF("Uncertainty", "SIGMA uncertainty metric"),
DQUAL_1("-999", "Data quality metric; incidence directions"),
DQUAL_2("-999", "Data quality metric; emission directions"),
DSIG_DEF("UNK", "Defines uncertainty metric in ancillary file"),
END(null, null),
// additional fits tags describing gravity derived values
DENSITY("-999", "[kgm^-3] density of body"),
ROT_RATE("-999", "[rad/sec] rotation rate of body"),
REF_POT("-999", "[J/kg] reference potential of body"),
// additional fits tags describing tilt derived values
TILT_RAD("-999", "[m]"),
TILT_MAJ("-999", "[m] semi-major axis of ellipse for tilt calcs"),
TILT_MIN("-999", "[m] semi-minor axis of ellipse for tilt calcs"),
TILT_PA("-999", "[deg] position angle of ellipse for tilt calcs"),
// Additional fits tags specific to ancillary fits files
MAP_NAME("FillMeIn", "Map data type"),
MAP_TYPE("FillMeIn", "Defines whether this is a global or local map"),
OBJ_FILE("TEMPLATEENTRY", null),
// keywords specific to facet mapping ancillary file
OBJINDX("UNK", "OBJ indexed to OBJ_FILE"),
GSDINDX("-999", "[unk] Ground sample distance of OBJINDX"),
GSDINDXI("-999", "[unk] GSDINDX integer"),
// return this enum when no match is found
NOMATCH(null, "could not determine");
@@ -123,7 +151,6 @@ public enum HeaderTag {
// PIXPDEG(null,"pixels per degree","pixel/deg"),
// PIX_SZ(null, "mean size of pixels at equator (meters)","m");
private FitsValCom fitsValCom;
private HeaderTag(String value, String comment) {
@@ -141,12 +168,9 @@ public enum HeaderTag {
/**
* Contains all valid Fits keywords for this Enum. 'NOMATCH' is not a valid Fits keyword
*/
public static final EnumSet<HeaderTag> fitsKeywords = EnumSet.range(HeaderTag.SIMPLE, HeaderTag.GSDINDXI);
public static final EnumSet<HeaderTag> globalDTMFitsData = EnumSet.of(HeaderTag.CLAT, HeaderTag.CLON);
/**
* Return the HeaderTag associated with a given string. returns NOMATCH enum if no match found.
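As the comment above notes, HeaderTag lookup falls back to the NOMATCH constant rather than throwing when a FITS keyword is unknown. A self-contained sketch of that valueOf-with-fallback idiom, using only a small subset of illustrative constants:

```java
// Sketch of the valueOf-with-fallback idiom used by HeaderTag: unknown
// FITS keywords map to a NOMATCH constant instead of raising an exception.
class TagLookupSketch {
    enum HeaderTag {
        SIMPLE, BITPIX, NOMATCH;

        static HeaderTag fromString(String key) {
            try {
                return HeaderTag.valueOf(key);
            } catch (IllegalArgumentException e) {
                return NOMATCH; // e.g. COMMENT or PLANE* cards
            }
        }
    }
}
```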
@@ -85,7 +85,6 @@ public class NFTmln extends DTMFits implements DTMHeader {
headers.add(fitsHdr.getHeaderCard(HeaderTag.NPRDVERS));
return headers;
}
/**
@@ -102,7 +101,6 @@ public class NFTmln extends DTMFits implements DTMHeader {
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATASRCF));
return headers;
}
/**
@@ -121,7 +119,6 @@ public class NFTmln extends DTMFits implements DTMHeader {
headers.add(fitsHdr.getHeaderCard(HeaderTag.SOFT_VER));
return headers;
}
/**
@@ -150,7 +147,6 @@ public class NFTmln extends DTMFits implements DTMHeader {
headers.add(fitsHdr.getHeaderCardD(HeaderTag.DQUAL_2));
return headers;
}
/**
@@ -168,7 +164,6 @@ public class NFTmln extends DTMFits implements DTMHeader {
headers.add(fitsHdr.getHeaderCard(HeaderTag.DATENPRD));
return headers;
}
/**
@@ -185,7 +180,5 @@ public class NFTmln extends DTMFits implements DTMHeader {
headers.add(fitsHdr.getHeaderCard(HeaderTag.CREATOR));
return headers;
}
}
@@ -34,12 +34,12 @@ import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.TimeZone;
import nom.tam.fits.BasicHDU;
import nom.tam.fits.Fits;
import nom.tam.fits.FitsException;
import nom.tam.fits.HeaderCard;
import nom.tam.util.Cursor;
import org.apache.commons.io.FilenameUtils;
import terrasaur.enums.FitsHeaderType;
import terrasaur.enums.PlaneInfo;
import terrasaur.fits.FitsHdr.FitsHdrBuilder;
@@ -65,8 +65,7 @@ public class ProductFits {
* @throws FitsException
* @throws IOException
*/
public static List<HeaderCard> getPlaneHeaderCards(String fitsFile) throws FitsException, IOException {
Fits inFits = new Fits(fitsFile);
List<HeaderCard> planeHeaders = getPlaneHeaderCards(inFits);
return planeHeaders;
@@ -81,8 +80,7 @@ public class ProductFits {
* @throws FitsException
* @throws IOException
*/
public static List<HeaderCard> getPlaneHeaderCards(Fits inFitsFile) throws FitsException, IOException {
BasicHDU inHdu = inFitsFile.getHDU(0);
List<HeaderCard> planeHeaders = getPlaneHeaderCards(inHdu);
return planeHeaders;
@@ -116,8 +114,7 @@ public class ProductFits {
* @throws IOException
* @throws FitsException
*/
public static Map<String, Double> minMaxLLFromFits(File fitsFile) throws FitsException, IOException {
System.out.println("Determining minmax lat lon from fits file:" + fitsFile.getAbsolutePath());
@@ -298,8 +295,7 @@ public class ProductFits {
// save outfile name in cross-reference file, for future reference
String path = FilenameUtils.getFullPath(outFname);
if (path.length() == 0) path = ".";
String outBaseName = String.format("%s%s%s", path, File.pathSeparator, FilenameUtils.getBaseName(outFname));
AsciiFile crfFile = new AsciiFile(crossrefFile.getAbsolutePath());
crfFile.streamSToFile(outBaseName, 0);
crfFile.closeFile();
@@ -327,10 +323,7 @@ public class ProductFits {
* @throws FitsException
*/
public static void saveDataCubeFits(
double[][][] dataCube, FitsHdrBuilder hdrBuilder, List<HeaderCard> planeHeaders, String outFile)
throws FitsException, IOException {
FitsHdr fitsHdr = hdrBuilder.build();
@@ -357,10 +350,7 @@ public class ProductFits {
* @throws FitsException
*/
public static void saveDataCubeFits(
double[][][] dataCube, List<HeaderCard> headers, List<HeaderCard> planeKeywords, String outFname)
throws FitsException, IOException {
// append planeHeaders
@@ -414,8 +404,7 @@ public class ProductFits {
DTMHeader nftFitsHeader = FitsHeaderFactory.getDTMHeader(fitsHeader, FitsHeaderType.NFTMLN);
nftFitsHeader.setData(fitsData);
List<HeaderCard> headers = nftFitsHeader.createFitsHeader(PlaneInfo.planesToHeaderCard(planeList));
System.out.println("Saving to " + outNftFile);
FitsUtil.saveFits(fitsData.getData(), outNftFile, headers);
@@ -23,7 +23,6 @@
package terrasaur.fits;
public enum UnitDir {
UX {
public int getAxis() {
return 1;
@@ -36,9 +36,9 @@ import javafx.scene.control.ChoiceBox;
import javafx.scene.control.Label;
import javafx.scene.control.TextField;
import javafx.stage.Stage;
import terrasaur.apps.TranslateTime;
import terrasaur.utils.AppVersion;
import spice.basic.SpiceException;
public class TranslateTimeController implements Initializable {
@@ -56,16 +56,12 @@ public class TranslateTimeController implements Initializable {
// populate SCLK menu
NavigableSet<Integer> sclkIDs = new TreeSet<>(TranslateTimeFX.sclkIDs);
this.sclkChoice.getItems().addAll(sclkIDs);
this.sclkChoice.getSelectionModel().selectedIndexProperty().addListener(new ChangeListener<Number>() {
@Override
public void changed(ObservableValue<? extends Number> observable, Number oldValue, Number newValue) {
tt.setSCLKKernel(sclkChoice.getItems().get((Integer) newValue));
}
});
this.sclkChoice.getSelectionModel().select(0);
@@ -90,8 +86,7 @@ public class TranslateTimeController implements Initializable {
@FXML
private void setJulian() throws NumberFormatException, SpiceException {
if (julianString.getText().trim().length() > 0) tt.setJulianDate(Double.parseDouble(julianString.getText()));
updateTime();
}
@@ -103,8 +98,7 @@ public class TranslateTimeController implements Initializable {
@FXML
private void setSCLK() throws SpiceException {
if (sclkString.getText().trim().length() > 0) tt.setSCLK(sclkString.getText());
updateTime();
}
@@ -113,8 +107,7 @@ public class TranslateTimeController implements Initializable {
@FXML
private void setTDB() throws SpiceException {
if (tdbString.getText().trim().length() > 0) tt.setTDB(Double.parseDouble(tdbString.getText()));
updateTime();
}
@@ -123,8 +116,7 @@ public class TranslateTimeController implements Initializable {
@FXML
private void setTDBCalendar() throws SpiceException {
if (tdbCalendarString.getText().trim().length() > 0) tt.setTDBCalendarString(tdbCalendarString.getText());
updateTime();
}
@@ -136,8 +128,7 @@ public class TranslateTimeController implements Initializable {
@FXML
private void setUTC() throws SpiceException {
if (utcString.getText().trim().length() > 0) tt.setUTC(utcString.getText());
updateTime();
}
@@ -149,5 +140,4 @@ public class TranslateTimeController implements Initializable {
utcString.setText(tt.toTDB().toUTCString("ISOC", 3));
utcLabel.setText(String.format("UTC (DOY %s)", tt.toTDB().toString("DOY")));
}
}
@@ -62,5 +62,4 @@ public class TranslateTimeFX extends Application {
stage.setScene(scene);
stage.show();
}
}
@@ -22,7 +22,6 @@
*/
package terrasaur.smallBodyModel;
import picante.math.intervals.Interval;
import picante.math.intervals.UnwritableInterval;
import picante.math.vectorspace.UnwritableVectorIJK;
@@ -55,7 +54,9 @@ public class BoundingBox {
* @param bounds
*/
public BoundingBox(double[] bounds) {
this(
new Interval(bounds[0], bounds[1]),
new Interval(bounds[2], bounds[3]),
new Interval(bounds[4], bounds[5]));
}
@@ -66,8 +67,7 @@ public class BoundingBox {
* @param yRange
* @param zRange
*/
public BoundingBox(UnwritableInterval xRange, UnwritableInterval yRange, UnwritableInterval zRange) {
this.xRange = new Interval(xRange);
this.yRange = new Interval(yRange);
this.zRange = new Interval(zRange);
@@ -145,7 +145,8 @@ public class BoundingBox {
* @return
*/
public boolean intersects(BoundingBox other) {
return xRange.closedIntersects(other.xRange) && yRange.closedIntersects(other.yRange)
return xRange.closedIntersects(other.xRange)
&& yRange.closedIntersects(other.yRange)
&& zRange.closedIntersects(other.zRange);
}
@@ -179,8 +180,7 @@ public class BoundingBox {
* @return
*/
public boolean contains(UnwritableVectorIJK pt) {
return xRange.closedContains(pt.getI()) && yRange.closedContains(pt.getJ())
&& zRange.closedContains(pt.getK());
return xRange.closedContains(pt.getI()) && yRange.closedContains(pt.getJ()) && zRange.closedContains(pt.getK());
}
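The `intersects`/`contains` methods above follow the standard closed-interval axis-aligned bounding box (AABB) pattern. A minimal self-contained sketch of the same test, using a hypothetical `Box` record rather than picante's `Interval` class:

```java
public class AabbSketch {
    // Minimal AABB with a closed-interval overlap test, mirroring the
    // semantics of BoundingBox.intersects: boxes overlap (or merely touch)
    // on every axis.
    public record Box(double xMin, double xMax, double yMin, double yMax, double zMin, double zMax) {
        public boolean intersects(Box o) {
            return xMin <= o.xMax && o.xMin <= xMax
                && yMin <= o.yMax && o.yMin <= yMax
                && zMin <= o.zMax && o.zMin <= zMax;
        }
    }

    public static void main(String[] args) {
        Box a = new Box(0, 1, 0, 1, 0, 1);
        Box b = new Box(1, 2, 0, 1, 0, 1); // shares the x = 1 face with a
        System.out.println(a.intersects(b)); // true
    }
}
```

Two boxes that merely share a face still count as intersecting, matching the `closedIntersects` semantics used above.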
/**
View File
@@ -22,6 +22,7 @@
*/
package terrasaur.smallBodyModel;
import com.google.common.collect.HashMultimap;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
@@ -34,11 +35,10 @@ import org.apache.commons.math3.geometry.euclidean.threed.Rotation;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import com.google.common.collect.HashMultimap;
import picante.math.vectorspace.VectorIJK;
import terrasaur.utils.math.MathConversions;
import terrasaur.utils.PolyDataStatistics;
import terrasaur.utils.PolyDataUtil;
import terrasaur.utils.math.MathConversions;
import terrasaur.utils.tessellation.FibonacciSphere;
import vtk.vtkPoints;
import vtk.vtkPolyData;
@@ -51,7 +51,7 @@ import vtk.vtkPolyData;
*/
public class LocalModelCollection {
private final static Logger logger = LogManager.getLogger();
private static final Logger logger = LogManager.getLogger();
static class LocalModel {
final Vector3D center;
@@ -61,7 +61,6 @@ public class LocalModelCollection {
this.center = center;
this.filename = filename;
}
}
// key is tile index, value is collection of localModels
@@ -109,19 +108,18 @@ public class LocalModelCollection {
*/
public SmallBodyModel get(Vector3D point) {
List<String> filenames = getFilenames(point);
if (filenames.size() == 0)
logger.error("No shape models cover {}", point.toString());
if (filenames.size() == 0) logger.error("No shape models cover {}", point.toString());
double[] origin = new double[3];
double[] intersectPoint = new double[3];
for (String filename : filenames) {
SmallBodyModel sbm = load(filename);
long intersect =
sbm.computeRayIntersection(origin, point.toArray(), 2 * point.getNorm(), intersectPoint);
long intersect = sbm.computeRayIntersection(origin, point.toArray(), 2 * point.getNorm(), intersectPoint);
if (intersect != -1)
return sbm;
if (intersect != -1) return sbm;
}
logger.debug("Failed intersection for lon {}, lat {}", Math.toDegrees(point.getAlpha()),
logger.debug(
"Failed intersection for lon {}, lat {}",
Math.toDegrees(point.getAlpha()),
Math.toDegrees(point.getDelta()));
return null;
}
@@ -141,8 +139,7 @@ public class LocalModelCollection {
SmallBodyModel sbm = map.get(filename);
if (sbm == null) {
logger.debug("Thread {}: Loading {}", Thread.currentThread().getId(),
FilenameUtils.getBaseName(filename));
logger.debug("Thread {}: Loading {}", Thread.currentThread().getId(), FilenameUtils.getBaseName(filename));
try {
vtkPolyData model = PolyDataUtil.loadShapeModel(filename);
if (scale != null || rotation != null) {
@@ -153,9 +150,13 @@ public class LocalModelCollection {
for (int i = 0; i < points.GetNumberOfPoints(); i++) {
Vector3D thisPoint = new Vector3D(points.GetPoint(i));
if (scale != null)
thisPoint = thisPoint.subtract(center).scalarMultiply(scale).add(center);
thisPoint = thisPoint
.subtract(center)
.scalarMultiply(scale)
.add(center);
if (rotation != null)
thisPoint = rotation.applyTo(thisPoint.subtract(center)).add(center);
thisPoint =
rotation.applyTo(thisPoint.subtract(center)).add(center);
points.SetPoint(i, thisPoint.toArray());
}
}
@@ -195,7 +196,8 @@ public class LocalModelCollection {
localDistanceMap.put(thisDist, localModel);
}
// add all local models with centers within PI/4 of point
for (double localDist : localDistanceMap.headMap(Math.PI / 4, true).keySet()) {
for (double localDist :
localDistanceMap.headMap(Math.PI / 4, true).keySet()) {
smallBodyModels.add(localDistanceMap.get(localDist).filename);
}
}
@@ -203,5 +205,4 @@ public class LocalModelCollection {
return smallBodyModels;
}
}
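The selection loop above keeps every local model whose center lies within π/4 radians of the query point. The `TreeMap.headMap(bound, inclusive)` idiom it relies on can be sketched in isolation (the tile filenames here are invented for illustration):

```java
import java.util.TreeMap;

public class NearestModels {
    public static void main(String[] args) {
        // Key: angular distance (radians) from the query point to a model
        // center; value: the model's filename (hypothetical names).
        TreeMap<Double, String> byDistance = new TreeMap<>();
        byDistance.put(0.10, "tile_a.obj");
        byDistance.put(0.60, "tile_b.obj");
        byDistance.put(1.20, "tile_c.obj");
        // headMap(PI/4, true) returns the sorted view of all entries with
        // key <= PI/4, inclusive, mirroring LocalModelCollection's loop.
        System.out.println(byDistance.headMap(Math.PI / 4, true).values()); // [tile_a.obj, tile_b.obj]
    }
}
```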
View File
@@ -30,34 +30,34 @@ import terrasaur.smallBodyModel.ImmutableSBMTStructure.Builder;
/**
*
* <pre>
# SBMT Structure File
# type,point
# ------------------------------------------------------------------------------
# File consists of a list of structures on each line.
#
# Each line is defined by 17 columns with the following:
# &lt;id&gt; &lt;name&gt; &lt;centerXYZ[3]&gt; &lt;centerLLR[3]&gt; &lt;coloringValue[4]&gt; &lt;diameter&gt; &lt;flattening&gt; &lt;regularAngle&gt; &lt;colorRGB&gt; &lt;label&gt;*
#
# id: Id of the structure
# name: Name of the structure
# centerXYZ[3]: 3 columns that define the structure center in 3D space
# centerLLR[3]: 3 columns that define the structure center in lat,lon,radius
# coloringValue[4]: 4 columns that define the ellipse &#8220;standard&#8221; colorings. The
# colorings are: slope (NA), elevation (NA), acceleration (NA), potential (NA)
# diameter: Diameter of (semimajor) axis of ellipse
# flattening: Flattening factor of ellipse. Range: [0.0, 1.0]
# regularAngle: Angle between the semimajor axis and the line of longitude
# as projected onto the surface
# colorRGB: 1 column (of RGB values [0, 255] separated by commas with no
# spaces). This column appears as a single textual column.
# label: Label of the structure
#
#
# Please note the following:
# - Each line is composed of columns separated by a tab character.
# - Blank lines or lines that start with '#' are ignored.
# - Angle units: degrees
# - Length units: kilometers
* # SBMT Structure File
* # type,point
* # ------------------------------------------------------------------------------
* # File consists of a list of structures on each line.
* #
* # Each line is defined by 17 columns with the following:
* # &lt;id&gt; &lt;name&gt; &lt;centerXYZ[3]&gt; &lt;centerLLR[3]&gt; &lt;coloringValue[4]&gt; &lt;diameter&gt; &lt;flattening&gt; &lt;regularAngle&gt; &lt;colorRGB&gt; &lt;label&gt;*
* #
* # id: Id of the structure
* # name: Name of the structure
* # centerXYZ[3]: 3 columns that define the structure center in 3D space
* # centerLLR[3]: 3 columns that define the structure center in lat,lon,radius
* # coloringValue[4]: 4 columns that define the ellipse &#8220;standard&#8221; colorings. The
* # colorings are: slope (NA), elevation (NA), acceleration (NA), potential (NA)
* # diameter: Diameter of (semimajor) axis of ellipse
* # flattening: Flattening factor of ellipse. Range: [0.0, 1.0]
* # regularAngle: Angle between the semimajor axis and the line of longitude
* # as projected onto the surface
* # colorRGB: 1 column (of RGB values [0, 255] separated by commas with no
* # spaces). This column appears as a single textual column.
* # label: Label of the structure
* #
* #
* # Please note the following:
* # - Each line is composed of columns separated by a tab character.
* # - Blank lines or lines that start with '#' are ignored.
* # - Angle units: degrees
* # - Length units: kilometers
* </pre>
*
* @author Hari.Nair@jhuapl.edu
@@ -117,8 +117,8 @@ public abstract class SBMTStructure {
String[] parts = line.split("\\s+");
int id = Integer.parseInt(parts[0]);
String name = parts[1];
Vector3D centerXYZ = new Vector3D(Double.parseDouble(parts[2]), Double.parseDouble(parts[3]),
Double.parseDouble(parts[4]));
Vector3D centerXYZ =
new Vector3D(Double.parseDouble(parts[2]), Double.parseDouble(parts[3]), Double.parseDouble(parts[4]));
String slopeColoring = parts[8];
String elevationColoring = parts[9];
String accelerationColoring = parts[10];
@@ -127,8 +127,8 @@ public abstract class SBMTStructure {
double flattening = Double.parseDouble(parts[13]);
double regularAngle = Double.parseDouble(parts[14]);
String[] colorParts = parts[15].split(",");
Color rgb = new Color(Integer.parseInt(colorParts[0]), Integer.parseInt(colorParts[1]),
Integer.parseInt(colorParts[2]));
Color rgb = new Color(
Integer.parseInt(colorParts[0]), Integer.parseInt(colorParts[1]), Integer.parseInt(colorParts[2]));
String label = parts[16];
Builder builder = ImmutableSBMTStructure.builder();
@@ -146,6 +146,4 @@ public abstract class SBMTStructure {
builder.label(label);
return builder.build();
}
}
View File
@@ -39,7 +39,7 @@ import vtk.vtkPolyData;
*/
public class SmallBodyCubes {
private final static Logger logger = LogManager.getLogger(SmallBodyCubes.class);
private static final Logger logger = LogManager.getLogger(SmallBodyCubes.class);
private BoundingBox boundingBox;
private ArrayList<BoundingBox> allCubes = new ArrayList<BoundingBox>();
@@ -55,21 +55,20 @@ public class SmallBodyCubes {
* @param cubeSize
* @param buffer
*/
public SmallBodyCubes(vtkPolyData smallBodyPolyData, double cubeSize, double buffer,
boolean removeEmptyCubes) {
public SmallBodyCubes(vtkPolyData smallBodyPolyData, double cubeSize, double buffer, boolean removeEmptyCubes) {
this.cubeSize = cubeSize;
this.buffer = buffer;
initialize(smallBodyPolyData);
if (removeEmptyCubes)
removeEmptyCubes(smallBodyPolyData);
if (removeEmptyCubes) removeEmptyCubes(smallBodyPolyData);
}
private void initialize(vtkPolyData smallBodyPolyData) {
smallBodyPolyData.ComputeBounds();
double[] bounds = smallBodyPolyData.GetBounds();
boundingBox = new BoundingBox(new Interval(bounds[0] - buffer, bounds[1] + buffer),
boundingBox = new BoundingBox(
new Interval(bounds[0] - buffer, bounds[1] + buffer),
new Interval(bounds[2] - buffer, bounds[3] + buffer),
new Interval(bounds[4] - buffer, bounds[5] + buffer));
@@ -86,8 +85,8 @@ public class SmallBodyCubes {
for (int i = 0; i < numCubesX; ++i) {
double xmin = boundingBox.getXRange().getBegin() + i * cubeSize;
double xmax = boundingBox.getXRange().getBegin() + (i + 1) * cubeSize;
BoundingBox bb = new BoundingBox(new Interval(xmin, xmax), new Interval(ymin, ymax),
new Interval(zmin, zmax));
BoundingBox bb = new BoundingBox(
new Interval(xmin, xmax), new Interval(ymin, ymax), new Interval(zmin, zmax));
allCubes.add(bb);
}
}
@@ -188,14 +187,12 @@ public class SmallBodyCubes {
* @return
*/
public int getCubeId(double[] point) {
if (!boundingBox.contains(point))
return -1;
if (!boundingBox.contains(point)) return -1;
int numberCubes = allCubes.size();
for (int i = 0; i < numberCubes; ++i) {
BoundingBox cube = getCube(i);
if (cube.contains(point))
return i;
if (cube.contains(point)) return i;
}
// If we reach here something is wrong
View File
@@ -28,7 +28,6 @@ import java.util.ArrayList;
import java.util.HashSet;
import java.util.Set;
import java.util.TreeSet;
import org.apache.commons.math3.geometry.euclidean.threed.Rotation;
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.commons.math3.stat.descriptive.DescriptiveStatistics;
@@ -65,10 +64,11 @@ import vtk.vtksbModifiedBSPTree;
*/
public class SmallBodyModel {
private final static Logger logger = LogManager.getLogger(SmallBodyModel.class);
private static final Logger logger = LogManager.getLogger(SmallBodyModel.class);
public enum ColoringValueType {
POINT_DATA, CELLDATA
POINT_DATA,
CELLDATA
}
private vtkPolyData smallBodyPolyData;
@@ -114,8 +114,12 @@ public class SmallBodyModel {
setSmallBodyPolyData(polyData, coloringValues, coloringNames, coloringUnits, coloringValueType);
}
public void setSmallBodyPolyData(vtkPolyData polydata, vtkFloatArray[] coloringValues,
String[] coloringNames, String[] coloringUnits, ColoringValueType coloringValueType) {
public void setSmallBodyPolyData(
vtkPolyData polydata,
vtkFloatArray[] coloringValues,
String[] coloringNames,
String[] coloringUnits,
ColoringValueType coloringValueType) {
smallBodyPolyData.DeepCopy(polydata);
smallBodyPolyData.BuildLinks(0);
@@ -201,11 +205,9 @@ public class SmallBodyModel {
// comes out to 1 km for Eros.
// Compute bounding box diagonal length of lowest res shape model
double diagonalLength =
new BoundingBox(getLowResSmallBodyPolyData().GetBounds()).getDiagonalLength();
double diagonalLength = new BoundingBox(getLowResSmallBodyPolyData().GetBounds()).getDiagonalLength();
double cubeSize = diagonalLength / 38.66056033363347;
smallBodyCubes =
new SmallBodyCubes(getLowResSmallBodyPolyData(), cubeSize, 0.01 * cubeSize, true);
smallBodyCubes = new SmallBodyCubes(getLowResSmallBodyPolyData(), cubeSize, 0.01 * cubeSize, true);
}
return smallBodyCubes;
@@ -274,8 +276,7 @@ public class SmallBodyModel {
* @return
*/
public double[] getNormalAtPoint(double[] point, double radius) {
return PolyDataUtil.getPolyDataNormalAtPointWithinRadius(point, smallBodyPolyData, pointLocator,
radius);
return PolyDataUtil.getPolyDataNormalAtPointWithinRadius(point, smallBodyPolyData, pointLocator, radius);
}
public double[] getClosestNormal(double[] point) {
@@ -402,8 +403,7 @@ public class SmallBodyModel {
double[] origin = {0.0, 0.0, 0.0};
double[] lookPt = {rect.getI(), rect.getJ(), rect.getK()};
return computeRayIntersection(origin, lookPt, 10 * getBoundingBoxDiagonalLength(),
intersectPoint);
return computeRayIntersection(origin, lookPt, 10 * getBoundingBoxDiagonalLength(), intersectPoint);
}
/**
@@ -428,11 +428,9 @@ public class SmallBodyModel {
data[0][m][n] = lat;
data[1][m][n] = lon;
long cellId =
getPointAndCellIdFromLatLon(Math.toRadians(lat), Math.toRadians(lon), intersectPoint);
long cellId = getPointAndCellIdFromLatLon(Math.toRadians(lat), Math.toRadians(lon), intersectPoint);
double rad = -1.0e32;
if (cellId >= 0)
rad = new VectorIJK(intersectPoint).getLength();
if (cellId >= 0) rad = new VectorIJK(intersectPoint).getLength();
else {
logger.info(String.format("Warning: no intersection at lat:%.5f, lon:%.5f", lat, lon));
}
@@ -472,8 +470,7 @@ public class SmallBodyModel {
* @param intersectPoint (returned)
* @return the cellId of the cell containing the intersect point or -1 if no intersection
*/
public long computeRayIntersection(double[] origin, double[] direction, double distance,
double[] intersectPoint) {
public long computeRayIntersection(double[] origin, double[] direction, double distance, double[] intersectPoint) {
double[] lookPt = new double[3];
lookPt[0] = origin[0] + 2.0 * distance * direction[0];
@@ -487,17 +484,14 @@ public class SmallBodyModel {
int[] subId = new int[1];
long[] cellId = new long[1];
int result = cellLocator.IntersectWithLine(origin, lookPt, tol, t, x, pcoords, subId, cellId,
genericCell);
int result = cellLocator.IntersectWithLine(origin, lookPt, tol, t, x, pcoords, subId, cellId, genericCell);
intersectPoint[0] = x[0];
intersectPoint[1] = x[1];
intersectPoint[2] = x[2];
if (result > 0)
return cellId[0];
else
return -1;
if (result > 0) return cellId[0];
else return -1;
}
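`computeRayIntersection` above delegates to VTK's cell locator, which ultimately performs ray-triangle tests against the shape model's facets. A self-contained Möller-Trumbore sketch of that underlying test, with no VTK dependency (helper names are our own, not Terrasaur API):

```java
public class RayTriangle {
    // Möller-Trumbore ray/triangle intersection. Returns the distance t along
    // the (unit or non-unit) ray direction to the hit point, or -1 on a miss.
    public static double intersect(double[] orig, double[] dir, double[] v0, double[] v1, double[] v2) {
        double[] e1 = sub(v1, v0), e2 = sub(v2, v0);
        double[] p = cross(dir, e2);
        double det = dot(e1, p);
        if (Math.abs(det) < 1e-12) return -1; // ray parallel to triangle plane
        double inv = 1.0 / det;
        double[] t = sub(orig, v0);
        double u = dot(t, p) * inv;           // first barycentric coordinate
        if (u < 0 || u > 1) return -1;
        double[] q = cross(t, e1);
        double v = dot(dir, q) * inv;         // second barycentric coordinate
        if (v < 0 || u + v > 1) return -1;
        double dist = dot(e2, q) * inv;
        return dist >= 0 ? dist : -1;         // reject hits behind the origin
    }

    static double[] sub(double[] a, double[] b) {
        return new double[] {a[0] - b[0], a[1] - b[1], a[2] - b[2]};
    }

    static double[] cross(double[] a, double[] b) {
        return new double[] {
            a[1] * b[2] - a[2] * b[1], a[2] * b[0] - a[0] * b[2], a[0] * b[1] - a[1] * b[0]
        };
    }

    static double dot(double[] a, double[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    public static void main(String[] args) {
        double[] origin = {0, 0, 0}, dir = {0, 0, 1};
        double[] v0 = {-1, -1, 5}, v1 = {1, -1, 5}, v2 = {0, 1, 5};
        System.out.println(intersect(origin, dir, v0, v1, v2)); // 5.0
    }
}
```

The cell locator performs this test against many facets via a spatial tree; the sketch shows only the per-facet geometry.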
/**
@@ -510,8 +504,7 @@ public class SmallBodyModel {
public Vector3D findEastVector(double[] pt) {
// define a topographic frame where the Z axis points up and the Y axis points north. The X axis
// will point east.
Rotation bodyFixedToTopo =
RotationUtils.KprimaryJsecondary(new Vector3D(pt), Vector3D.PLUS_K);
Rotation bodyFixedToTopo = RotationUtils.KprimaryJsecondary(new Vector3D(pt), Vector3D.PLUS_K);
return bodyFixedToTopo.applyTo(Vector3D.PLUS_I);
}
@@ -525,8 +518,7 @@ public class SmallBodyModel {
public Vector3D findWestVector(double[] pt) {
// define a topographic frame where the Z axis points up and the Y axis points north. The X axis
// will point east.
Rotation bodyFixedToTopo =
RotationUtils.KprimaryJsecondary(new Vector3D(pt), Vector3D.PLUS_K);
Rotation bodyFixedToTopo = RotationUtils.KprimaryJsecondary(new Vector3D(pt), Vector3D.PLUS_K);
return bodyFixedToTopo.applyTo(Vector3D.MINUS_I);
}
@@ -557,12 +549,14 @@ public class SmallBodyModel {
double[] pt1 = points.GetPoint(1);
double[] pt2 = points.GetPoint(2);
TriangularFacet facet =
new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
TriangularFacet facet = new TriangularFacet(new VectorIJK(pt0), new VectorIJK(pt1), new VectorIJK(pt2));
stats.addValue(VectorIJK.subtract(facet.getVertex1(), facet.getVertex2()).getLength());
stats.addValue(VectorIJK.subtract(facet.getVertex2(), facet.getVertex3()).getLength());
stats.addValue(VectorIJK.subtract(facet.getVertex3(), facet.getVertex1()).getLength());
stats.addValue(
VectorIJK.subtract(facet.getVertex1(), facet.getVertex2()).getLength());
stats.addValue(
VectorIJK.subtract(facet.getVertex2(), facet.getVertex3()).getLength());
stats.addValue(
VectorIJK.subtract(facet.getVertex3(), facet.getVertex1()).getLength());
points.Delete();
cell.Delete();
@@ -572,24 +566,17 @@ public class SmallBodyModel {
}
public String getModelName() {
if (resolutionLevel >= 0 && resolutionLevel < modelNames.length)
return modelNames[resolutionLevel];
else
return null;
if (resolutionLevel >= 0 && resolutionLevel < modelNames.length) return modelNames[resolutionLevel];
else return null;
}
/** clean up VTK allocated internal objects */
public void delete() {
if (cellLocator != null)
cellLocator.Delete();
if (bspLocator != null)
bspLocator.Delete();
if (pointLocator != null)
pointLocator.Delete();
if (genericCell != null)
genericCell.Delete();
if (smallBodyPolyData != null)
smallBodyPolyData.Delete();
if (cellLocator != null) cellLocator.Delete();
if (bspLocator != null) bspLocator.Delete();
if (pointLocator != null) pointLocator.Delete();
if (genericCell != null) genericCell.Delete();
if (smallBodyPolyData != null) smallBodyPolyData.Delete();
}
public void saveAsPLT(File file) throws IOException {
@@ -607,5 +594,4 @@ public class SmallBodyModel {
public void saveAsSTL(File file) throws IOException {
PolyDataUtil.saveShapeModelAsSTL(smallBodyPolyData, file.getAbsolutePath());
}
}
View File
@@ -58,8 +58,7 @@ public class DefaultTerrasaurTool implements TerrasaurTool {
private static Options defineOptions() {
Options options = TerrasaurTool.defineOptions();
options.addOption(
Option.builder("env")
options.addOption(Option.builder("env")
.hasArgs()
.required()
.desc("Print the named environment variable's value. Can take multiple arguments.")
@@ -78,8 +77,7 @@ public class DefaultTerrasaurTool implements TerrasaurTool {
for (MessageLabel ml : startupMessages.keySet())
logger.info(String.format("%s %s", ml.label, startupMessages.get(ml)));
for (String env : cl.getOptionValues("env"))
logger.info(String.format("%s: %s", env, System.getenv(env)));
for (String env : cl.getOptionValues("env")) logger.info(String.format("%s: %s", env, System.getenv(env)));
logger.info("Finished");
}
View File
@@ -22,17 +22,16 @@
*/
package terrasaur.templates;
import org.apache.commons.cli.*;
import org.apache.logging.log4j.Level;
import terrasaur.utils.AppVersion;
import terrasaur.utils.Log4j2Configurator;
import java.io.PrintWriter;
import java.io.StringWriter;
import java.net.InetAddress;
import java.net.UnknownHostException;
import java.util.LinkedHashMap;
import java.util.Map;
import org.apache.commons.cli.*;
import org.apache.logging.log4j.Level;
import terrasaur.utils.AppVersion;
import terrasaur.utils.Log4j2Configurator;
/**
* All classes in the apps folder should implement this interface. Calling the class without
@@ -46,8 +45,7 @@ public interface TerrasaurTool {
/** Show required options first, followed by non-required. */
class CustomHelpFormatter extends HelpFormatter {
public CustomHelpFormatter() {
setOptionComparator(
(o1, o2) -> {
setOptionComparator((o1, o2) -> {
if (o1.isRequired() && !o2.isRequired()) return -1;
if (!o1.isRequired() && o2.isRequired()) return 1;
return o1.getKey().compareToIgnoreCase(o2.getKey());
@@ -132,18 +130,15 @@ public interface TerrasaurTool {
*/
static Options defineOptions() {
Options options = new Options();
options.addOption(
Option.builder("logFile")
options.addOption(Option.builder("logFile")
.hasArg()
.desc("If present, save screen output to log file.")
.build());
StringBuilder sb = new StringBuilder();
for (Level l : Level.values()) sb.append(String.format("%s ", l.name()));
options.addOption(
Option.builder("logLevel")
options.addOption(Option.builder("logLevel")
.hasArg()
.desc(
"If present, print messages above selected priority. Valid values are "
.desc("If present, print messages above selected priority. Valid values are "
+ sb.toString().trim()
+ ". Default is INFO.")
.build());
@@ -198,15 +193,13 @@ public interface TerrasaurTool {
Log4j2Configurator lc = Log4j2Configurator.getInstance();
if (cl.hasOption("logLevel"))
lc.setLevel(Level.valueOf(cl.getOptionValue("logLevel").toUpperCase().trim()));
lc.setLevel(
Level.valueOf(cl.getOptionValue("logLevel").toUpperCase().trim()));
if (cl.hasOption("logFile")) lc.addFile(cl.getOptionValue("logFile"));
StringBuilder sb =
new StringBuilder(
String.format(
"%s [%s] on %s",
getClass().getSimpleName(), AppVersion.getVersionString(), hostname));
StringBuilder sb = new StringBuilder(
String.format("%s [%s] on %s", getClass().getSimpleName(), AppVersion.getVersionString(), hostname));
startupMessages.put(MessageLabel.START, sb.toString());
sb = new StringBuilder();
@@ -226,5 +219,4 @@ public interface TerrasaurTool {
return startupMessages;
}
}
View File
@@ -24,26 +24,25 @@
package terrasaur.utils;
public class AppVersion {
public final static String lastCommit = "25.04.27";
public static final String lastCommit = "25.07.30";
// an M at the end of gitRevision means this was built from a "dirty" git repository
public final static String gitRevision = "cb0f7f8";
public final static String applicationName = "Terrasaur";
public final static String dateString = "2025-Apr-28 15:06:13 UTC";
public static final String gitRevision = "6212144";
public static final String applicationName = "Terrasaur";
public static final String dateString = "2025-Jul-30 16:05:45 UTC";
private AppVersion() {}
/**
* Terrasaur version 25.04.27-cb0f7f8 built 2025-Apr-28 15:06:13 UTC
* Terrasaur version 25.07.30-6212144 built 2025-Jul-30 16:05:45 UTC
*/
public static String getFullString() {
return String.format("%s version %s-%s built %s", applicationName, lastCommit, gitRevision, dateString);
}
/**
* Terrasaur version 25.04.27-cb0f7f8
* Terrasaur version 25.07.30-6212144
*/
public static String getVersionString() {
return String.format("%s version %s-%s", applicationName, lastCommit, gitRevision);
}
}
View File
@@ -86,7 +86,9 @@ public class Binary16 {
if ((fbits & 0x7fffffff) >= 0x47800000) { // is or must become NaN/Inf
if (val < 0x7f800000) // was value but too large
return sign | 0x7c00; // make it +/-Inf
return sign | 0x7c00 | // remains +/-Inf or NaN
return sign
| 0x7c00
| // remains +/-Inf or NaN
(fbits & 0x007fffff) >>> 13; // keep NaN (and Inf) bits
}
return sign | 0x7bff; // unrounded not quite Inf
@@ -96,9 +98,9 @@ public class Binary16 {
if (val < 0x33000000) // too small for subnormal
return sign; // becomes +/-0
val = (fbits & 0x7fffffff) >>> 23; // tmp exp for subnormal calc
return sign | ((fbits & 0x7fffff | 0x800000) // add subnormal bit
return sign
| ((fbits & 0x7fffff | 0x800000) // add subnormal bit
+ (0x800000 >>> val - 102) // round depending on cut off
>>> 126 - val); // div by 2^(1-(exp-127+15)) and >> 13 | exp=0
}
}
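The shift-and-mask conversion above predates the JDK's built-in half-precision support. Since Java 20 (within this project's Java 21 requirement), `Float.floatToFloat16` and `Float.float16ToFloat` perform the same IEEE 754 binary16 conversion and can serve as a cross-check:

```java
public class Binary16Check {
    public static void main(String[] args) {
        // Float.floatToFloat16 / float16ToFloat were added in JDK 20, so
        // they are available under Terrasaur's Java 21 requirement.
        float[] samples = {0.0f, 1.0f, -2.5f, 65504.0f, Float.POSITIVE_INFINITY};
        for (float f : samples) {
            short half = Float.floatToFloat16(f);  // round-to-nearest-even
            float back = Float.float16ToFloat(half);
            System.out.printf("%g -> 0x%04x -> %g%n", f, half, back);
        }
    }
}
```

65504 is the largest finite binary16 value, and 0x7c00 is +Infinity, matching the `sign | 0x7c00` branch above.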
View File
@@ -34,34 +34,33 @@ import java.io.IOException;
*/
public class BinaryUtils {
static public float readFloatAndSwap(DataInputStream is) throws IOException {
public static float readFloatAndSwap(DataInputStream is) throws IOException {
int intValue = is.readInt();
intValue = ByteSwapper.swap(intValue);
return Float.intBitsToFloat(intValue);
}
static public double readDoubleAndSwap(DataInputStream is) throws IOException {
public static double readDoubleAndSwap(DataInputStream is) throws IOException {
long longValue = is.readLong();
longValue = ByteSwapper.swap(longValue);
return Double.longBitsToDouble(longValue);
}
static public void writeFloatAndSwap(DataOutputStream os, float value) throws IOException {
public static void writeFloatAndSwap(DataOutputStream os, float value) throws IOException {
int intValue = Float.floatToRawIntBits(value);
intValue = ByteSwapper.swap(intValue);
os.writeInt(intValue);
}
static public void writeDoubleAndSwap(DataOutputStream os, double value) throws IOException {
public static void writeDoubleAndSwap(DataOutputStream os, double value) throws IOException {
long longValue = Double.doubleToRawLongBits(value);
longValue = ByteSwapper.swap(longValue);
os.writeLong(longValue);
}
// This function is taken from
// http://www.java2s.com/Code/Java/Language-Basics/Utilityforbyteswappingofalljavadatatypes.htm
static public short swap(short value) {
public static short swap(short value) {
int b1 = value & 0xff;
int b2 = (value >> 8) & 0xff;
@@ -70,7 +69,7 @@ public class BinaryUtils {
// This function is taken from
// http://www.java2s.com/Code/Java/Language-Basics/Utilityforbyteswappingofalljavadatatypes.htm
static public int swap(int value) {
public static int swap(int value) {
int b1 = (value >> 0) & 0xff;
int b2 = (value >> 8) & 0xff;
int b3 = (value >> 16) & 0xff;
@@ -108,5 +107,4 @@ public class BinaryUtils {
intValue = swap(intValue);
return Float.intBitsToFloat(intValue);
}
View File

@@ -40,8 +40,6 @@ package terrasaur.utils;
// package no.geosoft.cc.util;
/**
* Utility class for doing byte swapping (i.e. conversion between little-endian and big-endian
* representations) of different data types. Byte swapping is typically used when data is read from
@@ -63,8 +61,6 @@ public class ByteSwapper {
return (short) (b1 << 8 | b2 << 0);
}
/**
* Byte swap a single int value.
*
@@ -80,8 +76,6 @@ public class ByteSwapper {
return b1 << 24 | b2 << 16 | b3 << 8 | b4 << 0;
}
/**
* Byte swap a single long value.
*
@@ -101,8 +95,6 @@ public class ByteSwapper {
return b1 << 56 | b2 << 48 | b3 << 40 | b4 << 32 | b5 << 24 | b6 << 16 | b7 << 8 | b8 << 0;
}
/**
* Byte swap a single float value.
*
@@ -115,8 +107,6 @@ public class ByteSwapper {
return Float.intBitsToFloat(intValue);
}
/**
* Byte swap a single double value.
*
@@ -129,64 +119,48 @@ public class ByteSwapper {
return Double.longBitsToDouble(longValue);
}
/**
* Byte swap an array of shorts. The result of the swapping is put back into the specified array.
*
* @param array Array of values to swap
*/
public static void swap(short[] array) {
for (int i = 0; i < array.length; i++)
array[i] = swap(array[i]);
for (int i = 0; i < array.length; i++) array[i] = swap(array[i]);
}
/**
* Byte swap an array of ints. The result of the swapping is put back into the specified array.
*
* @param array Array of values to swap
*/
public static void swap(int[] array) {
for (int i = 0; i < array.length; i++)
array[i] = swap(array[i]);
for (int i = 0; i < array.length; i++) array[i] = swap(array[i]);
}
/**
* Byte swap an array of longs. The result of the swapping is put back into the specified array.
*
* @param array Array of values to swap
*/
public static void swap(long[] array) {
for (int i = 0; i < array.length; i++)
array[i] = swap(array[i]);
for (int i = 0; i < array.length; i++) array[i] = swap(array[i]);
}
/**
* Byte swap an array of floats. The result of the swapping is put back into the specified array.
*
* @param array Array of values to swap
*/
public static void swap(float[] array) {
for (int i = 0; i < array.length; i++)
array[i] = swap(array[i]);
for (int i = 0; i < array.length; i++) array[i] = swap(array[i]);
}
/**
* Byte swap an array of doubles. The result of the swapping is put back into the specified array.
*
* @param array Array of values to swap
*/
public static void swap(double[] array) {
for (int i = 0; i < array.length; i++)
array[i] = swap(array[i]);
for (int i = 0; i < array.length; i++) array[i] = swap(array[i]);
}
}
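The explicit shift-and-mask swaps above can be cross-checked against the JDK's `reverseBytes` helpers (standard since Java 5). Note that floats and doubles must be swapped via their raw bit patterns, never as numeric values, so NaN payloads survive:

```java
public class SwapSketch {
    // Equivalent of ByteSwapper.swap(int) using the JDK built-in.
    public static int swap(int value) {
        return Integer.reverseBytes(value);
    }

    // Equivalent of ByteSwapper.swap(float): reverse the raw bit pattern,
    // then reinterpret. floatToRawIntBits avoids collapsing NaN payloads.
    public static float swap(float value) {
        return Float.intBitsToFloat(Integer.reverseBytes(Float.floatToRawIntBits(value)));
    }

    public static void main(String[] args) {
        System.out.printf("0x%08x%n", swap(0x12345678)); // 0x78563412
    }
}
```

`Short.reverseBytes`, `Long.reverseBytes`, and `Double.doubleToRawLongBits` cover the remaining widths the same way.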