Here we list several codes that we have developed and used in related publications.

MFNETS: multifidelity networked surrogates

This package contains the routines necessary to generate networked surrogate models for multifidelity information fusion. It forms the basis of the routines used to generate the results of the following paper:

  • Gorodetsky, A. A., Jakeman, J. D., and Geraci, G. "MFNets: Learning network representations for multifidelity surrogates." 2020.

The code can be found on GitHub. Autogenerated documentation is available here.
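The scaling-plus-discrepancy idea behind networked multifidelity surrogates can be illustrated with a minimal two-model sketch in plain NumPy. This is hypothetical illustrative code, not the MFNets package's API: the models, data sizes, and the linear discrepancy form are all assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
f_lo = lambda x: np.sin(np.pi * x)                  # cheap low-fidelity model
f_hi = lambda x: 1.2 * np.sin(np.pi * x) + 0.3 * x  # expensive high-fidelity model

# Abundant low-fidelity data, scarce high-fidelity data
x_lo, x_hi = rng.uniform(-1, 1, 50), rng.uniform(-1, 1, 8)
y_lo, y_hi = f_lo(x_lo), f_hi(x_hi)

# Fit a polynomial surrogate to the low-fidelity data alone
s_lo = np.polynomial.Polynomial.fit(x_lo, y_lo, 5)

# Learn a scaling rho and a linear discrepancy d0 + d1*x from the few
# high-fidelity samples: s_hi(x) = rho * s_lo(x) + d0 + d1 * x
A = np.column_stack([s_lo(x_hi), np.ones_like(x_hi), x_hi])
rho, d0, d1 = np.linalg.lstsq(A, y_hi, rcond=None)[0]
s_hi = lambda x: rho * s_lo(x) + d0 + d1 * x

x_test = np.linspace(-1, 1, 101)
err = np.max(np.abs(s_hi(x_test) - f_hi(x_test)))
```

The fused surrogate tracks the expensive model far better than a fit to the eight high-fidelity points alone would, because the cheap model carries most of the trend.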

Compressed Continuous Computation (C3)

The Compressed Continuous Computation (C3) library (in C) enables computation with multivariate functions. Many multilinear algebra computations are included. Common tasks include addition, multiplication, integration, differentiation, and approximation of multivariate functions.

Doxygen-based documentation is available here.
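As a rough illustration of why separated (low-rank) representations make these operations cheap, here is a conceptual sketch in plain Python — not the C3 C API — that integrates a bivariate function stored as a sum of products of univariate factors. A multidimensional integral then reduces to products of one-dimensional integrals, one pair per rank term.

```python
import numpy as np

# f(x, y) = sin(x)*cos(y) + x*y, stored as a rank-2 separated representation:
# a list of (univariate factor in x, univariate factor in y) pairs.
factors = [
    (np.sin, np.cos),
    (lambda x: x, lambda y: y),
]

def integrate_1d(g, a, b, n=201):
    """Composite trapezoid rule for a univariate factor."""
    t = np.linspace(a, b, n)
    v = g(t)
    h = (b - a) / (n - 1)
    return h * (v.sum() - 0.5 * (v[0] + v[-1]))

# The integral over [0,1]^2 is a sum over rank terms of products of 1D integrals
total = sum(integrate_1d(fx, 0, 1) * integrate_1d(fy, 0, 1)
            for fx, fy in factors)
# Exact value for comparison: (1 - cos(1)) * sin(1) + 1/4
```

The same separation-of-variables trick is what keeps addition, multiplication, and integration tractable in high dimensions when functions admit low-rank structure.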

Stochastic Optimal Control with Compressed Continuous Computation (C3SC)

The C3SC add-on to the C3 package provides utilities for solving general stochastic optimal control problems. The included algorithms can handle non-affine controls and non-quadratic costs. The underlying theory is based on low-rank representations of value functions.
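The value functions such solvers compute can be illustrated with a toy dynamic-programming sketch in plain NumPy, unrelated to the C3SC API: value iteration for a discounted stochastic control problem on a one-dimensional grid, with a quadratic running cost chosen purely for simplicity (the grid, dynamics, and cost are all assumptions for this example).

```python
import numpy as np

states = np.linspace(-1, 1, 21)            # discretized state space
controls = np.array([-0.2, 0.0, 0.2])      # admissible control actions
gamma = 0.95                               # discount factor
noise = np.array([-0.05, 0.0, 0.05])       # additive process noise values
probs = np.array([0.25, 0.5, 0.25])        # their probabilities

def nearest(x):
    """Index of the grid state closest to x (clamped to the domain)."""
    return np.abs(states[None, :] - np.clip(x, -1, 1)[:, None]).argmin(axis=1)

V = np.zeros_like(states)
for _ in range(500):
    Q = np.empty((len(states), len(controls)))
    for j, u in enumerate(controls):
        cost = states ** 2 + 0.1 * u ** 2  # simple running cost for illustration
        # Expected value of the next state under the process noise
        EV = sum(p * V[nearest(states + u + w)] for p, w in zip(probs, noise))
        Q[:, j] = cost + gamma * EV
    V_new = Q.min(axis=1)                  # Bellman update
    if np.max(np.abs(V_new - V)) < 1e-10:
        V = V_new
        break
    V = V_new
policy = controls[Q.argmin(axis=1)]        # greedy policy from converged values
```

On a grid this small the full value function fits in memory; the point of the low-rank representations used in C3SC is to make the same Bellman recursion feasible when the state space is high-dimensional.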

Experimental design for Gaussian process regression (GPEXP)

GPEXP is a software package, written in Python 2.7, for performing experimental design in the context of Gaussian process (GP) regression. Experimental design may be performed for a variety of cost functions; currently supported cost functions include those based on integrated variance, conditional entropy, and mutual information. GPEXP may also be used for general-purpose GP regression. Currently supported kernels include the isotropic and anisotropic squared exponential kernels, the isotropic Matérn kernel, and the Mehler kernel; additional kernels may be easily specified. GPEXP also includes optimization routines for estimating kernel hyperparameters from data.
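To illustrate the integrated-variance criterion, here is a small NumPy sketch — not the GPEXP API — that greedily selects design points minimizing the posterior variance of a GP, averaged over the domain, under an isotropic squared exponential kernel. The grid size, length scale, and greedy strategy are assumptions made for this example.

```python
import numpy as np

def sqexp(X, Y, ell=0.3):
    """Isotropic squared exponential kernel on 1D inputs."""
    return np.exp(-0.5 * (X[:, None] - Y[None, :]) ** 2 / ell ** 2)

grid = np.linspace(0, 1, 101)  # candidate points, reused as quadrature nodes
chosen = []
for _ in range(4):
    best, best_iv = None, np.inf
    for c in grid:
        pts = np.array(chosen + [c])
        K = sqexp(pts, pts) + 1e-8 * np.eye(len(pts))  # jitter for stability
        k = sqexp(grid, pts)
        # Posterior variance at each grid point given observations at pts
        var = 1.0 - np.einsum('ij,ij->i', k @ np.linalg.inv(K), k)
        if var.mean() < best_iv:               # integrated (averaged) variance
            best, best_iv = c, var.mean()
    chosen.append(best)
```

Because the GP prior is stationary, the selected points spread out to cover the domain, which is the qualitative behavior integrated-variance designs aim for.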


A number of our algorithms have been incorporated into other software packages and have found utility in other application areas. Below we list these packages and describe how they relate to our work. These resources are excellent entry points for gaining exposure to our latest algorithms.

Dakota (by Sandia National Laboratories)

The Dakota project is the industry state-of-the-art software package for complex analyses having to do with uncertainty quantification of computational models. Recent releases have included C3 as a third-party library. As a result, it is now possible to perform extensive UQ analysis with functional tensor trains.

PyApprox (from John Jakeman at Sandia National Laboratories)

Many of the multifidelity uncertainty quantification algorithms that we have published are implemented in PyApprox. In particular, this package provides implementations of the approximate control variate (ACV) and MFNets approaches. The package also includes tools for a wide range of areas of uncertainty quantification.

MXMCPy (from Geoffrey Bomarito, James Warner, Patrick Leser, William Leser, and Luke Morrill at NASA Langley)

This package implements a wide range of multi-fidelity (multi-model) Monte Carlo sampling algorithms. In particular, it provides extensive realizations, as well as extensions, of the approximate control variate (ACV) approach. The GitHub repository is here.
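The basic control variate idea that these methods generalize can be sketched in a few lines of NumPy. This is illustrative only — not the MXMCPy API — and the two models, sample sizes, and seed are assumptions for the example: a small number of expensive-model evaluations is combined with many cheap-model evaluations to reduce estimator variance.

```python
import numpy as np

rng = np.random.default_rng(1)
f_hi = lambda z: z ** 2 + 0.1 * np.sin(10 * z)  # "expensive" model of interest
f_lo = lambda z: z ** 2                         # correlated cheap model

N, M = 100, 10000              # few expensive runs, many cheap ones
z = rng.uniform(0, 1, N)       # shared samples evaluated on both models
zp = rng.uniform(0, 1, M)      # extra samples evaluated on the cheap model only
y_hi, y_lo = f_hi(z), f_lo(z)

C = np.cov(y_hi, y_lo)
alpha = C[0, 1] / C[1, 1]      # estimated optimal control-variate weight
mu_lo = f_lo(zp).mean()        # cheap, accurate estimate of E[f_lo]

# Control variate estimator of E[f_hi]
est = y_hi.mean() - alpha * (y_lo.mean() - mu_lo)
```

Because the cheap model absorbs most of the variance, the corrected estimator is far more accurate than the plain N-sample mean of the expensive model; ACV methods extend this construction to many models and optimal sample allocations.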

Copyright (c) 2020, Alex Gorodetsky, License: CC BY-SA 4.0