Properties of SG⁺⁺
- Permissive license: SG⁺⁺ is free and open-source software. Its permissive BSD-like license allows it to be integrated into existing software without licensing concerns.
- Portability: SG⁺⁺ runs on Linux, macOS, and Windows.
- Rapid prototyping: Bindings for Python, Java, and MATLAB are available in addition to the native C++ interface. SG⁺⁺ itself is written mainly in C++, with some parts in Python.
- Efficiency: All performance-critical parts are implemented in C++.
- Parallelism: SG⁺⁺ supports shared-memory parallelization with OpenMP and distributed-memory parallelization with MPI.
- Modularity: SG⁺⁺ consists of separate modules that can be switched off individually, which reduces binary size and compile time.
Selected Features and Examples
The following list covers only the most important features of SG⁺⁺, together with a selection of examples.
Function Interpolation
Examples: Quick start example (C++), quick start example (Python)
- Regular sparse grids
- Spatial adaptivity
- Numerous types of sparse grid bases (piecewise linear, B-splines, etc.)
- Dimensional adaptivity with the sparse grid combination technique
- Sophisticated hierarchization (interpolation) algorithms, or hierarchization via solution of the corresponding linear system (see the Python sketch below)
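The snippet below is a minimal Python sketch of this workflow, modeled on the pysgpp quick-start example: create a regular sparse grid, hierarchize the nodal values, and evaluate the resulting interpolant. Method names such as getGenerator, getPoint, and getStandardCoordinate follow recent SG⁺⁺ releases and may differ in older versions, so treat it as an outline rather than authoritative API documentation.

```python
# Hedged sketch of the interpolation workflow (names follow recent pysgpp versions).
import pysgpp

f = lambda x, y: 16.0 * x * (1.0 - x) * y * (1.0 - y)

dim, level = 2, 3
grid = pysgpp.Grid.createLinearGrid(dim)   # piecewise linear basis, no boundary points
grid.getGenerator().regular(level)         # regular sparse grid of level 3
storage = grid.getStorage()

# Nodal values of f at the grid points ...
alpha = pysgpp.DataVector(storage.getSize())
for i in range(storage.getSize()):
    gp = storage.getPoint(i)
    alpha[i] = f(gp.getStandardCoordinate(0), gp.getStandardCoordinate(1))

# ... are transformed into hierarchical surpluses (hierarchization).
pysgpp.createOperationHierarchisation(grid).doHierarchisation(alpha)

# Evaluate the sparse grid interpolant at an arbitrary point of [0, 1]^2.
p = pysgpp.DataVector(dim)
p[0], p[1] = 0.52, 0.73
print(pysgpp.createOperationEval(grid).eval(alpha, p))
```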
Data Mining & Machine Learning
Examples: Classification example (C++), regression example (C++)
- Clustering of data points
- Sparse grid regression (see the sketch after this list)
- Classification
- Support vector machines
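To make the regression item concrete, the following is a hedged sketch that reduces sparse grid regression to regularized least squares: the matrix B of hierarchical basis values at the data points is assembled naively with the evaluation operation, and the surpluses α solve (BᵀB + λI)α = Bᵀy. The pysgpp calls (createOperationEval, DataVector.setAll) are assumed from the base module; SG⁺⁺'s datadriven module provides dedicated, far more efficient learners for this task, whose API is not shown here.

```python
# Hedged sketch: sparse grid ridge regression via explicit least squares.
import numpy as np
import pysgpp

rng = np.random.default_rng(0)
X = rng.random((200, 2))                       # training inputs in [0, 1]^2
y = np.sin(4.0 * X[:, 0]) * X[:, 1] + 0.05 * rng.standard_normal(200)

grid = pysgpp.Grid.createLinearGrid(2)
grid.getGenerator().regular(4)
n = grid.getStorage().getSize()
op_eval = pysgpp.createOperationEval(grid)

# B[i, j] = value of the j-th hierarchical basis function at data point x_i,
# assembled naively by evaluating unit coefficient vectors (fine for a demo).
B = np.empty((len(X), n))
unit = pysgpp.DataVector(n)
p = pysgpp.DataVector(2)
for j in range(n):
    unit.setAll(0.0)
    unit[j] = 1.0
    for i, x in enumerate(X):
        p[0], p[1] = float(x[0]), float(x[1])
        B[i, j] = op_eval.eval(unit, p)

# Regularized normal equations: (B^T B + lambda * I) alpha = B^T y.
lam = 1e-4
alpha = np.linalg.solve(B.T @ B + lam * np.eye(n), B.T @ y)
print("first hierarchical surpluses:", alpha[:5])
```

In practice one would add spatial adaptivity (refining grid points where the residual is large) and avoid the dense normal equations.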
Uncertainty Quantification & Quadrature
Example: Density estimation example (C++)
- Sparse grid density estimation
- Quadrature of sparse grid functions with various basis types (see the sketch after this list)
- Estimation of expected values and variances
- Built-in probability distributions
- Propagation of fuzzy uncertainties with the fuzzy extension principle
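A minimal quadrature sketch, assuming the createOperationQuadrature and doQuadrature calls from the base module: after hierarchization, the interpolant is integrated over the unit hypercube in a single call. Quadrature with respect to the built-in probability distributions is not shown.

```python
# Hedged sketch: integrate a sparse grid interpolant over [0, 1]^2.
# Exact value for this f: 16 * (1/6) * (1/6) = 4/9 ≈ 0.4444.
import pysgpp

f = lambda x, y: 16.0 * x * (1.0 - x) * y * (1.0 - y)

grid = pysgpp.Grid.createLinearGrid(2)
grid.getGenerator().regular(5)
storage = grid.getStorage()

alpha = pysgpp.DataVector(storage.getSize())
for i in range(storage.getSize()):
    gp = storage.getPoint(i)
    alpha[i] = f(gp.getStandardCoordinate(0), gp.getStandardCoordinate(1))
pysgpp.createOperationHierarchisation(grid).doHierarchisation(alpha)

# One call integrates the interpolant; the result approaches 4/9 as the level grows.
print(pysgpp.createOperationQuadrature(grid).doQuadrature(alpha))
```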
Partial Differential Equations (PDEs)
- Solvers for various PDEs (heat equation, Poisson equation, etc.)
- Use of external solvers via the sparse grid combination technique (see the sketch below)
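The combination technique mentioned in the last item can be sketched without any SG⁺⁺ calls. Under one common indexing convention, the level-n sparse grid solution in d dimensions combines anisotropic full grid solutions u_l with coefficients (-1)^q · C(d-1, q) over all level vectors with |l|₁ = n - q, q = 0, …, d-1; the coefficients always sum to 1. The helper below only enumerates the component grids and coefficients; each u_l would come from an arbitrary external solver, for example an existing full grid PDE code.

```python
# Sketch: component grids and coefficients of the classical combination technique
# (one common indexing convention; others shift n or allow level 0).
from itertools import product
from math import comb

def combination_scheme(dim, n):
    """Yield (level_vector, coefficient) pairs with all levels >= 1."""
    for q in range(dim):
        coeff = (-1) ** q * comb(dim - 1, q)
        target = n - q
        for levels in product(range(1, target + 1), repeat=dim):
            if sum(levels) == target:
                yield levels, coeff

# 2D, level 4: grids with |l|_1 = 4 get coefficient +1, grids with |l|_1 = 3 get -1.
scheme = list(combination_scheme(2, 4))
for levels, coeff in scheme:
    print(levels, coeff)
print("coefficient sum:", sum(c for _, c in scheme))  # always 1

# In practice: run an external full grid solver on each component grid
# (mesh width 2^(-l_i) in dimension i) and form sum(coeff * u_l).
```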
Function Optimization
Examples: Optimization example (C++), optimization example (Python)
- Gradient-based optimization methods (steepest descent, Newton, etc.)
- Gradient-free optimization methods (Nelder-Mead, Differential Evolution, etc.; see the sketch after this list)
- Constrained optimization methods (Augmented Lagrangian)
- Adaptivity criteria tailored to optimization (Ritter-Novak)
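As a hedged illustration of surrogate-based optimization, the sketch below interpolates an objective on a sparse grid and then minimizes the interpolant with SciPy's Nelder-Mead implementation, used here as a stand-in for SG⁺⁺'s built-in optimizers, whose API is not shown. The pysgpp calls are the same as in the interpolation sketch above and carry the same version caveats.

```python
# Hedged sketch: minimize a sparse grid surrogate with a gradient-free method.
# SciPy's Nelder-Mead stands in for SG++'s built-in optimizers.
import numpy as np
import pysgpp
from scipy.optimize import minimize

f = lambda x, y: (x - 0.3) ** 2 + (y - 0.7) ** 2   # true minimizer: (0.3, 0.7)

grid = pysgpp.Grid.createLinearGrid(2)             # interpolant vanishes on the boundary,
grid.getGenerator().regular(5)                     # which is fine for this interior minimum
storage = grid.getStorage()

alpha = pysgpp.DataVector(storage.getSize())
for i in range(storage.getSize()):
    gp = storage.getPoint(i)
    alpha[i] = f(gp.getStandardCoordinate(0), gp.getStandardCoordinate(1))
pysgpp.createOperationHierarchisation(grid).doHierarchisation(alpha)

op_eval = pysgpp.createOperationEval(grid)

def surrogate(x):
    # Clamp to [0, 1]^2 so the optimizer never leaves the grid's domain.
    x = np.clip(x, 0.0, 1.0)
    p = pysgpp.DataVector(2)
    p[0], p[1] = float(x[0]), float(x[1])
    return op_eval.eval(alpha, p)

res = minimize(surrogate, x0=[0.5, 0.5], method="Nelder-Mead")
print("approximate minimizer:", res.x)             # close to (0.3, 0.7)
```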