Version: 0.6 (Latest)

Integration Guide

Overview

The Vantor Raptor Guide SDK determines the absolute position of airborne vehicles. It does this by matching, in real time, images captured from onboard camera sensors against high-resolution 3D maps (Vantor Vivid Terrain 3DSM and/or Vantor WorldView 3D). Raptor Guide estimates a separate camera pose for each individual frame and has no knowledge of previous frames or of the camera's rotation relative to the aircraft's orientation. It can therefore estimate the position and attitude of the camera, but not the heading or velocity of the aircraft. For this reason, Raptor Guide should not be used as a standalone navigation system; it is designed to be integrated and fused with other sensors, such as an IMU, magnetometer, and barometric sensors, in a complete navigation system.

Raptor Guide's ability to independently estimate the position using onboard processing of 3D maps makes it an ideal component to create GNSS-independent navigation systems, which can operate effectively even in environments where GNSS signals are denied or jammed. Hence, integrating Raptor Guide with other sensors enables the development of a robust and accurate navigation system even without the use of GNSS.

This guide covers the essential integration steps and best practices for incorporating the SDK into your application.

SDK Distribution and Dependencies

Shared Library Distribution

The Raptor Guide SDK is distributed as a shared library:

  • Library file: lib/libraptor.guide.so
  • Header files: Located in the include/ directory
  • Resources: Geo resources in resources/geo/ directory (shipped as a separate package but required for SDK functionality)

Python Bindings

Python bindings are available as a wheel package. You can either install the pre-built wheel directly or build from source:

  • Pre-built wheel: See python_sdk/ in the release package for installation instructions
  • Build from source: See examples/python/guide/bindings/ in the documentation package for build instructions
  • Usage examples: See examples/python/guide/usage/ in the documentation package for usage examples

GPU Requirements

The SDK uses Vulkan for GPU-accelerated rendering and computation. See Vulkan for:

  • Vulkan version and feature requirements
  • How to verify GPU support
  • GPU selection on multi-GPU systems

Licensing Overview

A valid license is required to use the Raptor Guide SDK. The path to the license file must be set in the configuration.

For a detailed description of fingerprint generation, license installation, replacement, and error handling, see the License Management Guide.

Quick Start

1. Basic Integration

Below is an example of the most basic integration of the SDK. All relative paths are relative to the raptor_guide_usage_example directory. For real applications, absolute paths are recommended for the map and geo resource paths.

#include "Guide.hpp"

// Step 1: Configure the SDK
raptor::guide::Config config{
    .imageWidth = 1920,                        // Image width in pixels (REQUIRED)
    .imageHeight = 1080,                       // Image height in pixels (REQUIRED)
    .licensePath = "/path/to/license.license", // Path to license file (REQUIRED)
    .mapPaths = {"data/example/map/vricon_3d_surface_model/data/db.r3db"},
    .geoResourcePath = "data/resources/geo",   // Path to geo resources (included with SDK)
    .coordinateSystem = {
        .referenceFrame = raptor::ReferenceFrame::Geodetic // or ECEF
    }
};

// Step 2: Initialize the SDK with configuration
raptor::Guide guide{config};

// Step 3: Process image with camera parameters and initial pose
auto result = guide.updatePose(imageData, hFov, vFov, position, attitude);

// Step 4: Validate results
if (result.result == raptor::Result::Ok && result.confidence > 0.75) {
    // Use refined pose: result.position, result.attitude
}

Alternative: Position-only estimation

auto result = guide.updatePosition(imageData, hFov, vFov, position, attitude);
// Returns PositionOutput with refined position only (no updated attitude)

2. Complete Examples

Complete C++ examples are available in the documentation package under examples/cpp/guide/. These include basic and advanced usage demonstrating covariance, time limits, and geodetic coordinates.

See also Conventions for detailed coordinate system and attitude representation guide.

Coordinate Systems

The SDK supports two reference frames. Choose based on your application's coordinate system:

ECEF (Earth-Centered Earth-Fixed)

  • Position: [X, Y, Z] in meters from Earth's center
  • Attitude: Rotation relative to ECEF X, Y, Z axes
  • Example: {1.06742e+06, -4.84072e+06, 4.00057e+06} meters

Geodetic (Latitude, Longitude, Height)

  • Position: [Latitude, Longitude, Height] (see Geodetic Coordinate Options below for details on units and vertical datum)
  • Attitude: Rotation relative to local NED (North-East-Down) frame
  • Recommended when: Covariance matrix is provided (pose search in north-east plane is more robust)
  • Example: {0.6894, -1.352, 1542} (39.5°, -77.5°, 1542m)

Geodetic Coordinate Options

When using the Geodetic reference frame, you can customize how coordinates are represented through geodetic options. These options control the units and vertical datum used for coordinate representation.

Horizontal Unit (Latitude/Longitude)

Controls the angular unit for latitude and longitude values:

  • RADIAN (default): Latitude and longitude in radians

    • Range: Latitude [-π/2, π/2], Longitude [-π, π]
    • Example: {0.6894, -1.352, 1542} (39.5°, -77.5°, 1542m)
  • DEGREE: Latitude and longitude in degrees

    • Range: Latitude [-90, 90], Longitude [-180, 180]
    • Example: {39.5, -77.5, 1542} (same location as above)

Vertical Unit (Height/Altitude)

Controls the unit of measurement for height above the reference surface:

  • METER (default): Height in meters

    • Example: {0.6894, -1.352, 1542} (1542 meters above ellipsoid/geoid)
  • FOOT: Height in feet

    • Example: {0.6894, -1.352, 5059} (same location, ~5059 feet above ellipsoid/geoid)
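The unit options only change how the same location is expressed. As a sanity check, the conversion helpers below (illustrative only, not part of the SDK API) reproduce the example values shown above:

```cpp
#include <cmath>

// Illustrative conversion helpers (not part of the SDK API) showing how the
// documented example coordinates relate across units.
constexpr double kPi = 3.14159265358979323846;

inline double radToDeg(double rad) { return rad * 180.0 / kPi; }
inline double degToRad(double deg) { return deg * kPi / 180.0; }
inline double metersToFeet(double m) { return m / 0.3048; } // international foot

// radToDeg(0.6894)   ≈ 39.5°   (latitude example above)
// metersToFeet(1542) ≈ 5059 ft (height example above)
```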

Vertical Datum (Height Reference)

Controls the reference surface for height measurements. See Conventions for details.

  • ELLIPSOID (default): Use when reference heights come from GPS/GNSS measurements
  • EGM2008: Use when reference heights come from surveyed elevation data, barometric altimeters, or topographic maps

Configuration Examples

Default (Radians, Meters, Ellipsoid):

raptor::guide::Config config{
    .imageWidth = 1920,
    .imageHeight = 1080,
    .licensePath = "/path/to/license.license",
    .mapPaths = {"path/to/map.r3db"},
    .geoResourcePath = "resources/geo",
    .coordinateSystem = {
        .referenceFrame = raptor::ReferenceFrame::Geodetic
        // GeodeticOptions uses defaults: RADIAN, METER, ELLIPSOID
    }
};

Custom: Degrees, Feet, Geoid:

raptor::guide::Config config{
    .imageWidth = 1920,
    .imageHeight = 1080,
    .licensePath = "/path/to/license.license",
    .mapPaths = {"path/to/map.r3db"},
    .geoResourcePath = "resources/geo",
    .coordinateSystem = {
        .referenceFrame = raptor::ReferenceFrame::Geodetic,
        .geodeticOptions = {
            .horizontalUnit = raptor::HorizontalUnit::Degree,
            .verticalUnit = raptor::VerticalUnit::Foot,
            .verticalDatum = raptor::VerticalDatum::EGM2008
        }
    }
};

Configuration Parameters

Required Configuration

raptor::guide::Config config{
    .licensePath = "/path/to/license.license", // REQUIRED: Path to license file
    .mapPaths = {"path/to/map.r3db"},          // REQUIRED: At least one map path
    .geoResourcePath = "resources/geo"         // REQUIRED: Path to geo resources
};

// Image dimensions can be set in config or at runtime before creating Guide
config.imageWidth = 1920;  // REQUIRED (before Guide construction): Image width in pixels
config.imageHeight = 1080; // REQUIRED (before Guide construction): Image height in pixels

Performance Tuning

raptor::guide::Config config{
    .imageWidth = 1920,
    .imageHeight = 1080,
    .licensePath = "/path/to/license.license",
    .mapPaths = {"path/to/map.r3db"},
    .geoResourcePath = "resources/geo",
    .detailStage1 = 8, // Initial search detail (recommended: 7-8)
    .detailStage2 = 9, // Final refinement detail (recommended: 8-9)
    .coordinateSystem = {
        .referenceFrame = raptor::ReferenceFrame::Geodetic
    }
};

Performance Guidelines:

  • Fast processing: detailStage1=7, detailStage2=8
  • Balanced (default): detailStage1=8, detailStage2=9
  • Higher accuracy: detailStage1=9, detailStage2=10

Advanced Settings

raptor::guide::Config config{
    .imageWidth = 1920,
    .imageHeight = 1080,
    .licensePath = "/path/to/license.license",
    .mapPaths = {"path/to/map.r3db"},
    .geoResourcePath = "resources/geo",
    .confidenceInterval = 0.95f, // Confidence level (0.68=1σ, 0.95=2σ)
    .coordinateSystem = {
        .coordinateEpoch = 2025.0f // Optional: tectonic shift correction
    }
};

Loading Configuration from JSON

The SDK provides a parseConfigFromJson() function to load configuration from a JSON file instead of hardcoding values in C++. This is useful for:

  • Deploying different configurations without recompiling
  • Managing multiple configuration profiles
  • Simplifying configuration management in production systems

JSON Configuration Format

{
  "licensePath": "/path/to/license.license",
  "mapPaths": ["/path/to/map1.r3db", "/path/to/map2.r3db"],
  "geoResourcePath": "/path/to/geo/resources",
  "imageWidth": 1920,
  "imageHeight": 1080,
  "detailStage1": 8,
  "detailStage2": 9,
  "confidenceInterval": 0.9,
  "coordinateSystem": {
    "referenceFrame": "Geodetic",
    "geodeticOptions": {
      "verticalDatum": "Ellipsoid",
      "horizontalUnit": "Radian",
      "verticalUnit": "Meter"
    }
  }
}

Required Fields:

  • licensePath: Path to license file
  • mapPaths: Array of paths to 3D map files (.r3db or .3tz)
  • geoResourcePath: Path to geo resources directory

Note: While these fields must hold valid values in the Config struct, they can be left empty in the JSON file (licensePath and geoResourcePath as empty strings, mapPaths as an empty array). If empty, you must set them at runtime before initializing the Guide object. The parser will fail if they are missing entirely.

Optional Fields:

  • imageWidth: Width of input images in pixels (can be set in config or at runtime before Guide construction)
  • imageHeight: Height of input images in pixels (can be set in config or at runtime before Guide construction)
  • detailStage1: Detail level for initial search (default: 8)
  • detailStage2: Detail level for refinement (default: 9)
  • confidenceInterval: Confidence level for covariance (default: 0.9)
  • coordinateSystem: Coordinate system configuration (default: Geodetic with defaults)

Coordinate System Options:

  • referenceFrame: "ECEF" or "Geodetic" (default: "Geodetic")
  • coordinateEpoch: Decimal year for tectonic correction (default: not used)
  • geodeticOptions: Geodetic coordinate options (only applies when referenceFrame is "Geodetic")
    • verticalDatum: "Ellipsoid" or "EGM2008" (default: "Ellipsoid")
    • horizontalUnit: "Radian" or "Degree" (default: "Radian")
    • verticalUnit: "Meter" or "Foot" (default: "Meter")

Usage Example

#include "Guide.hpp"
#include <iostream>

// Load configuration from JSON file
auto parseResult = raptor::parseConfigFromJson("config.json");

if (std::holds_alternative<raptor::guide::Config>(parseResult)) {
    // Successfully parsed configuration
    auto config = std::get<raptor::guide::Config>(parseResult);

    // Override or set required fields at runtime if they were empty in JSON
    if (config.licensePath.empty()) {
        config.licensePath = "/path/to/license.license";
    }
    if (config.mapPaths.empty()) {
        config.mapPaths = {"/path/to/map.r3db"};
    }
    if (config.geoResourcePath.empty()) {
        config.geoResourcePath = "/path/to/geo/resources";
    }

    // Image dimensions can be set in config or at runtime
    config.imageWidth = 1920; // Override if needed
    config.imageHeight = 1080;

    // Initialize Guide with loaded configuration
    raptor::Guide guide{config};

    // Process images...
    auto result = guide.updatePose(imageData, hFov, vFov, position, attitude);

} else {
    // Failed to parse configuration
    auto error = std::get<raptor::guide::ConfigParseError>(parseResult);
    // Log error and handle appropriately
}

Configuration Examples

Example 1: Basic Configuration (Geodetic, Radians, Meters, Ellipsoid)

{
  "licensePath": "/path/to/license.license",
  "mapPaths": ["/path/to/map.r3db"],
  "geoResourcePath": "/path/to/geo/resources"
}

Example 2: Custom Geodetic Options (Degrees, Feet, Geoid)

{
  "licensePath": "/path/to/license.license",
  "mapPaths": ["/path/to/map.r3db"],
  "geoResourcePath": "/path/to/geo/resources",
  "coordinateSystem": {
    "referenceFrame": "Geodetic",
    "geodeticOptions": {
      "horizontalUnit": "Degree",
      "verticalUnit": "Foot",
      "verticalDatum": "EGM2008"
    }
  }
}

Example 3: ECEF Configuration with Performance Tuning

{
  "licensePath": "/path/to/license.license",
  "mapPaths": ["/path/to/map1.r3db", "/path/to/map2.r3db"],
  "geoResourcePath": "/path/to/geo/resources",
  "detailStage1": 7,
  "detailStage2": 8,
  "confidenceInterval": 0.95,
  "coordinateSystem": {
    "referenceFrame": "ECEF"
  }
}

Example 4: High-Precision Configuration with Tectonic Correction

{
  "licensePath": "/path/to/license.license",
  "mapPaths": ["/path/to/map.r3db"],
  "geoResourcePath": "/path/to/geo/resources",
  "detailStage1": 9,
  "detailStage2": 10,
  "confidenceInterval": 0.99,
  "coordinateSystem": {
    "referenceFrame": "Geodetic",
    "coordinateEpoch": 2025.5,
    "geodeticOptions": {
      "horizontalUnit": "Degree",
      "verticalUnit": "Meter",
      "verticalDatum": "Ellipsoid"
    }
  }
}

Input Requirements

Image Data

  • Format: Grayscale, 8-bit, single channel
  • Layout: Row-major, top-to-bottom, left-to-right
  • Size: Must match config.imageWidth × config.imageHeight exactly
  • Memory: Contiguous array of imageWidth × imageHeight bytes
  • Preprocessing: Undistort and convert to grayscale before calling updatePose()
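Since the buffer size must match the configured dimensions exactly, a cheap guard before each call can catch mismatches early. A minimal sketch (the helper name is illustrative, not an SDK function):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Illustrative check (not an SDK function): an 8-bit grayscale frame must
// contain exactly imageWidth * imageHeight contiguous bytes, row-major.
inline bool frameSizeMatchesConfig(const std::vector<std::uint8_t>& frame,
                                   std::size_t imageWidth,
                                   std::size_t imageHeight) {
    return frame.size() == imageWidth * imageHeight;
}
```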

Camera Parameters

  • Field of View: Must match the actual camera calibration. Wide-angle lenses are not recommended; fisheye lenses are not supported.
  • Units: Radians

Initial Pose

  • Position: 3-element array in chosen reference frame
  • Attitude: Unit quaternion [x, y, z, w] (scalar component last)
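Navigation stacks often carry attitude as yaw-pitch-roll rather than a quaternion. The sketch below builds a unit quaternion in the [x, y, z, w] (scalar-last) layout from ZYX Euler angles, assuming the conventional aerospace rotation sequence; the helper is illustrative, not an SDK function:

```cpp
#include <array>
#include <cmath>

// Illustrative conversion (not part of the SDK API): build a unit quaternion
// in [x, y, z, w] layout (scalar last) from ZYX (yaw-pitch-roll) Euler
// angles in radians.
inline std::array<double, 4> eulerZYXToQuaternion(double yaw, double pitch, double roll) {
    const double cy = std::cos(yaw * 0.5),   sy = std::sin(yaw * 0.5);
    const double cp = std::cos(pitch * 0.5), sp = std::sin(pitch * 0.5);
    const double cr = std::cos(roll * 0.5),  sr = std::sin(roll * 0.5);
    return {
        sr * cp * cy - cr * sp * sy, // x
        cr * sp * cy + sr * cp * sy, // y
        cr * cp * sy - sr * sp * cy, // z
        cr * cp * cy + sr * sp * sy  // w (scalar, last)
    };
}
```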

Error Handling

Initialization Errors

The initialization process throws exceptions for errors. Always wrap in try-catch blocks.

try {
    raptor::guide::Config config{
        .imageWidth = 1920,
        .imageHeight = 1080,
        .licensePath = "/path/to/license.license",
        .mapPaths = {"path/to/map.r3db"},
        .geoResourcePath = "path/to/geo/resources"
    };
    raptor::Guide guide{config};
} catch (const std::runtime_error& e) {
    // Initialization failed - check paths, GPU resources, etc.
    std::cerr << "SDK initialization failed: " << e.what() << std::endl;
} catch (const std::invalid_argument& e) {
    // Invalid configuration parameters
    std::cerr << "Invalid configuration: " << e.what() << std::endl;
}

Result Validation (Two-Step Process)

updatePose() and updatePosition() have internal error handling and do not throw exceptions. Always check both result and confidence before using the output data.

// Inside your image processing loop:
auto result = guide.updatePose(imageData, hFov, vFov, position, attitude);

// Step 1: Check operation status
if (result.result == raptor::Result::Ok) {
    // Step 2: Check quality before using
    if (result.confidence > 0.7) {
        // Confidence acceptable
        if (result.confidence < 0.85) {
            logInfo("Moderate quality pose estimate - use with caution");
        }
        useRefinedPose(result.position, result.attitude);
    } else {
        logInfo("Poor quality - reject result");
        continue; // Skip to next image
    }
} else {
    // Operation failed - do not use result
    logError("Pose estimation failed");
    continue; // Skip to next image
}

Covariance Matrix Usage

Reference Frame Recommendation

Using ReferenceFrame::Geodetic is strongly recommended when providing a covariance matrix. The pose search in the local NED frame is more robust than searching in the ECEF frame.

Matrix Layout

The PoseCovariance is a 6×6 matrix in row-major order. The diagonal elements represent variances for each pose dimension. The interpretation depends on the configured reference frame:

Geodetic Reference Frame (recommended):

| Index | Component | Unit | Description                          |
|-------|-----------|------|--------------------------------------|
| [0]   | North     | m²   | Position variance in North direction |
| [7]   | East      | m²   | Position variance in East direction  |
| [14]  | Down      | m²   | Position variance in Down direction  |
| [21]  | Yaw       | rad² | Rotation variance about Down axis    |
| [28]  | Pitch     | rad² | Rotation variance about East axis    |
| [35]  | Roll      | rad² | Rotation variance about North axis   |

ECEF Reference Frame:

| Index | Component | Unit | Description                          |
|-------|-----------|------|--------------------------------------|
| [0]   | X         | m²   | Position variance along ECEF X axis  |
| [7]   | Y         | m²   | Position variance along ECEF Y axis  |
| [14]  | Z         | m²   | Position variance along ECEF Z axis  |
| [21]  | Rot-Z     | rad² | Rotation variance about ECEF Z axis  |
| [28]  | Rot-Y     | rad² | Rotation variance about ECEF Y axis  |
| [35]  | Rot-X     | rad² | Rotation variance about ECEF X axis  |
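A diagonal covariance with only the variances set is often a sufficient starting point. Below is a sketch of building the 6×6 row-major array from standard deviations, following the index layout above; the function name is illustrative, not part of the SDK:

```cpp
#include <array>
#include <cstddef>

// Illustrative builder (not an SDK function): a 6x6 row-major pose
// covariance with only diagonal terms set. In the Geodetic frame the
// position std devs are North/East/Down in meters and the rotation std
// devs are yaw/pitch/roll in radians.
inline std::array<double, 36> diagonalPoseCovariance(
    double sigmaNorth, double sigmaEast, double sigmaDown,
    double sigmaYaw, double sigmaPitch, double sigmaRoll) {
    std::array<double, 36> cov{}; // zero-initialized off-diagonals
    const double sigmas[6] = {sigmaNorth, sigmaEast, sigmaDown,
                              sigmaYaw, sigmaPitch, sigmaRoll};
    for (std::size_t i = 0; i < 6; ++i) {
        cov[i * 6 + i] = sigmas[i] * sigmas[i]; // variance = sigma^2
    }
    return cov;
}
```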

Attitude interpretation

The attitude is interpreted as Euler angles in the ZYX rotation sequence, also known as yaw-pitch-roll. The variance values from the covariance matrix are applied as ± offsets to the estimated input attitude; they are expressed relative to the reference frame, not the camera frame.

For more detail on how the attitude is represented, see Conventions.

Example to get max uncertainty in attitude:

auto [yaw, pitch, roll] = quaternionToEulerZYX(input.attitude);
auto [maxDiffYaw, maxDiffPitch, maxDiffRoll] = searchRangeFromCovariance(input.covariance);

std::cout << "Search within the following ranges:" << std::endl;
std::cout << "Yaw: " << yaw << " ± " << maxDiffYaw << std::endl;
std::cout << "Pitch: " << pitch << " ± " << maxDiffPitch << std::endl;
std::cout << "Roll: " << roll << " ± " << maxDiffRoll << std::endl;

Understanding Covariance Impact

The covariance matrix is optional but highly recommended for robust pose estimation:

// WITHOUT covariance (fastest, but limited)
auto result = guide.updatePose(imageData, hFov, vFov, position, attitude);
// → Assumes low uncertainty, processes single hypothesis only

// WITH covariance (recommended for uncertain poses)
std::array<double, 36> covariance = createCovarianceMatrix(/* uncertainties */);
auto result = guide.updatePose(imageData, hFov, vFov, position, attitude, covariance);
// → Search range proportional to uncertainty values

Search Range Calculation

The search range for pose estimation is determined by the covariance values and the configured confidenceInterval in Config.hpp. The relationship uses the chi-squared distribution. The confidenceInterval controls how much of your uncertainty region to search, not how confident you are in your covariance values.

For each pose dimension, the search range is calculated as:

searchRange = ±sqrt(chiSquaredValue × variance)

where chiSquaredValue is determined by the confidence interval and degrees of freedom (6 for pose).
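The relationship can be sketched as a pair of helpers, one for the forward calculation and one for choosing a variance that yields a desired search range (illustrative only, not SDK functions):

```cpp
#include <cmath>

// Illustrative helpers (not SDK functions) relating variance and search
// range: searchRange = sqrt(chiSquared * variance), and the inverse used
// to pick a variance that yields a desired search range.
inline double searchRangeFor(double chiSquared, double variance) {
    return std::sqrt(chiSquared * variance);
}
inline double varianceForRange(double chiSquared, double desiredRange) {
    return desiredRange * desiredRange / chiSquared;
}
```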

Confidence Interval and Chi-Squared Values

The following table shows common confidence intervals and their corresponding chi-squared values for 6 degrees of freedom. These values are computed using the Wilson-Hilferty approximation:

| Confidence Interval | Chi-Squared Value (χ²) | Description              |
|---------------------|------------------------|--------------------------|
| 0.68                | ~7.01                  | 1-sigma (68% confidence) |
| 0.90                | ~10.62                 | Default, good balance    |
| 0.95                | ~12.57                 | 2-sigma (95% confidence) |
| 0.99                | ~16.83                 | High confidence          |

Example: Calculating Search Range

Scenario: You want a search range of ±100m in each horizontal direction with a 95% confidence interval.

For 95% confidence, χ² ≈ 12.57. To achieve a 100m search range:

searchRange = sqrt(chiSquared × variance)
100 = sqrt(12.57 × variance)
variance = 100² / 12.57 ≈ 795.5 m²
standardDeviation = sqrt(795.5) ≈ 28.2 m

So you would set the position variance to approximately 795 m² (or equivalently, a standard deviation of ~28 m) in the covariance matrix.

Code example

// For a 95% confidence interval with ~28m standard deviation:
// searchRange = sqrt(12.57 × 795) = sqrt(9993) ≈ 100m
double northVar = 795.5; // variance ≈ 795 m² (std dev ≈ 28m)
double eastVar = 795.5; // variance ≈ 795 m² (std dev ≈ 28m)

// The algorithm will search within ±100m of the initial position
// in each horizontal direction at 95% confidence level

Performance Impact

  • Larger uncertainties: More hypotheses tested, longer processing time
  • Smaller uncertainties: Fewer hypotheses, faster processing but may not find correct pose if too small
  • No covariance: Single hypothesis only, fastest processing. Suitable when initial pose is accurate and only needs fine-tuning.

Performance Optimization

// GOOD: Create once, use multiple times
raptor::guide::Config config{
    .imageWidth = 1920,
    .imageHeight = 1080,
    .licensePath = "/path/to/license.license",
    .mapPaths = {"path/to/map.r3db"},
    .geoResourcePath = "path/to/geo/resources"
};
raptor::Guide guide{config};

for (const auto& image : images) {
    auto result = guide.updatePose(image.data, hFov, vFov, position, attitude);
    if (result.result == raptor::Result::Ok && result.confidence > 0.75) {
        // Process high-quality result...
    }
}

// AVOID: Creating new instance for each image (very expensive) and/or multiple instances of Guide (GPU resource conflicts possible)

Time Limits for Real-Time Applications

Depending on the platform and the magnitude of the uncertainties, the processing time can vary. It is recommended to set a time limit for real-time applications. The algorithm will return the best result found within the time limit.

// Set maximum processing time
auto maxTime = std::chrono::milliseconds(1000); // 1 second limit
auto result = guide.updatePose(imageData, hFov, vFov, position, attitude,
                               std::nullopt, maxTime);

if (result.result == raptor::Result::Ok) {
// Algorithm returned best result found within time limit
}

Thread Safety Considerations

RECOMMENDED: Single Instance Pattern

// Single instance with sequential processing (optimal performance)
raptor::guide::Config config{
    .imageWidth = 1920,
    .imageHeight = 1080,
    .licensePath = "/path/to/license.license",
    .mapPaths = {"path/to/map.r3db"},
    .geoResourcePath = "path/to/geo/resources"
};
raptor::Guide guide{config};
// Process images one at a time for best resource utilization

Important Notes:

  • NOT thread-safe: Never share a Guide instance across threads
  • GPU contention: Multiple instances should be avoided as they compete for GPU memory and compute
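One way to honor both notes is to confine the single Guide instance to one worker thread and feed it frames through a mutex-guarded queue, so callers never touch the instance directly. A simplified sketch of the pattern (the integer frame id and the processing callback stand in for real image data and the SDK call; nothing here is SDK API):

```cpp
#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>

// Illustrative pattern: a single worker thread owns the non-thread-safe
// object; other threads only submit work through a guarded queue.
class FrameWorker {
public:
    explicit FrameWorker(std::function<void(int)> processFrame)
        : process_(std::move(processFrame)), worker_([this] { run(); }) {}

    ~FrameWorker() {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            done_ = true;
        }
        cv_.notify_one();
        worker_.join(); // drains remaining frames before exiting
    }

    void submit(int frameId) { // frameId stands in for real image data
        {
            std::lock_guard<std::mutex> lock(mutex_);
            frames_.push(frameId);
        }
        cv_.notify_one();
    }

private:
    void run() {
        std::unique_lock<std::mutex> lock(mutex_);
        for (;;) {
            cv_.wait(lock, [this] { return done_ || !frames_.empty(); });
            if (frames_.empty() && done_) return;
            int frame = frames_.front();
            frames_.pop();
            lock.unlock();
            process_(frame); // only this thread would call guide.updatePose()
            lock.lock();
        }
    }

    std::function<void(int)> process_;
    std::mutex mutex_;
    std::condition_variable cv_;
    std::queue<int> frames_;
    bool done_ = false;
    std::thread worker_; // declared last: starts after other members exist
};
```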

Quality Assessment

Confidence (Quality Metric)

The Confidence score is a quality metric for assessing pose estimation results. It combines the reprojection error with an estimate of the match quality between input image and rendered texture.

if (result.result == raptor::Result::Ok) {
    if (result.confidence > 0.95) {
        // Excellent quality - high confidence result
    } else if (result.confidence > 0.85) {
        // Good quality - reliable for most applications
    } else if (result.confidence > 0.7) {
        // Moderate quality - use with caution
    } else {
        // Poor quality - recommended to reject (no robust match found)
    }
}

Quality Guidelines:

  • > 0.95: Excellent - very high confidence
  • 0.85-0.95: Good - reliable for most applications
  • 0.7-0.85: Moderate - use with caution, monitor consistency
  • < 0.7: Poor - reject result, likely no good match found
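The guidelines map naturally to a small classification helper, e.g. for logging or telemetry. The thresholds below are taken from the table above; the function itself is illustrative, not an SDK API:

```cpp
#include <string>

// Illustrative mapping from confidence score to a quality label,
// using the thresholds from the guidelines above (not an SDK function).
inline std::string confidenceQuality(double confidence) {
    if (confidence > 0.95) return "excellent";
    if (confidence > 0.85) return "good";
    if (confidence > 0.7)  return "moderate";
    return "reject"; // likely no good match found
}
```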

Common Issues & Solutions

Initialization Failures

Problem: SDK initialization throws exception

Solutions:

  • Verify geoResourcePath points to valid geo resources directory
  • Ensure map files (.r3db or .3tz) exist and are readable
  • Check available RAM/GPU memory (minimum 4GB required, 8GB recommended)
  • Verify Vulkan-compatible GPU and drivers
  • Use absolute paths for reliability

Poor Quality Results

Problem: Low Confidence (< 0.7) or Result::Failed

Root Causes & Solutions:

  1. Initial pose too inaccurate

    • If the initial pose is too far off, the algorithm may fail to find a good match
    • Provide covariance matrix reflecting actual uncertainty
  2. Camera calibration mismatch

    • Verify FOV values match actual camera calibration
    • Make sure the image has been undistorted, i.e. the camera's intrinsic parameters must be calibrated and compensated for before calling updatePose()
  3. Image quality issues

    • Ensure sufficient contrast and texture in image
    • Check for motion blur or focus issues
  4. Map coverage problems

    • Verify 3D map covers your area of interest
    • Check map resolution is sufficient for your altitude
  5. Correlation mismatch between image and map

    • If the image contains water bodies and/or sky, the correlation with the map will be poor.
    • Verify the map covers the area with sufficient detail
    • If the terrain has changed since the map was created (e.g. construction, natural disasters), the correlation will also be poor.

Covariance Configuration Issues

Problem: Estimation failures with uncertain initial poses

Solutions:

  • Always provide covariance when pose uncertainty is high
  • Set variance values to match actual uncertainty (not smaller)
  • Monitor success rates and adjust covariance accordingly

Performance Issues

Problem: Processing too slow for real-time applications

Solutions:

  • Set maxTime parameter for time-limited processing
  • Reduce detailStage1 and detailStage2 values
  • Provide appropriate covariance to focus search area

Best Practices Summary

Key Recommendations

  • Always provide covariance matrix when pose uncertainty is high
  • Use Geodetic reference frame for navigation applications with covariances
  • Set realistic time limits for real-time applications (1-10 seconds)
  • Monitor Confidence - reject results with Confidence < 0.7, and adjust the threshold to your application's requirements
  • Consistency monitoring: Track Confidence values across sequential frames
  • Instance management: Avoid creating multiple instances of Guide.

Reference

  • Complete Examples: See the examples/ directory in the documentation package
  • Coordinate Systems: See Conventions for detailed explanations
  • API Documentation: See header files in include/raptor/guide/