Utilities

class optbayesexpt.obe_utils.MeasurementSimulator(model_function, true_params, cons, noise_level)[source]

Bases: object

Provides simulated measurement data

Evaluates the model function and adds noise.

Parameters
  • model_function (func) – Generally the same as the function used by OptBayesExpt

  • true_params (tuple) – Parameter values, typically the “true values” of the simulated experiment.

  • cons (tuple) – Constants passed to the model function.

  • noise_level (float) – standard deviation of the added noise.

simdata(setting, params=None, noise_level=None)[source]

Simulate a measurement

Parameters
  • setting (tuple of floats) – The setting values

  • params (tuple of floats, optional) – If not None, used for this call in place of the true_params given at initialization.

  • noise_level (float, optional) – If not None, used for this call in place of the noise_level given at initialization.

Returns

Simulated measurement value(s)
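
A minimal usage sketch follows. The Lorentzian model function, its (settings, parameters, constants) calling convention, and all numerical values are illustrative assumptions, not part of the class definition.

    from optbayesexpt.obe_utils import MeasurementSimulator

    # Hypothetical model function in the OptBayesExpt style:
    # settings, parameters and constants are passed as tuples.
    def lorentzian_model(settings, parameters, constants):
        x, = settings                          # e.g. a probe frequency
        x0, amplitude, background = parameters
        linewidth, = constants
        return background + amplitude / (1 + ((x - x0) / linewidth) ** 2)

    true_params = (2.5, 1.0, 0.1)   # "true values" of the simulated experiment
    cons = (0.2,)                   # constants
    sim = MeasurementSimulator(lorentzian_model, true_params, cons,
                               noise_level=0.05)

    # one noisy simulated measurement at setting x = 2.4
    y_meas = sim.simdata((2.4,))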

optbayesexpt.obe_utils.differential_entropy(values, window_length=None, base=None, axis=0, method='auto')[source]

Given a sample of a distribution, estimate the differential entropy.

This code is copied from scipy.stats with reformatted docstrings. When the module is loaded, __init__.py attempts to import differential_entropy() from scipy.stats, and loads this version from obe_utils.py if an ImportError is raised.
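
A sketch of that fallback import pattern (the exact statements in __init__.py may differ):

    try:
        # use scipy's implementation when it is available
        from scipy.stats import differential_entropy
    except ImportError:
        # otherwise fall back to the copy bundled in obe_utils.py
        from optbayesexpt.obe_utils import differential_entropy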

Several estimation methods are available using the method parameter. By default, a method is selected based on the size of the sample.

Parameters
  • values (sequence) – Samples from a continuous distribution.

  • window_length (int, optional) –

    Window length for computing the Vasicek estimate. Must be an integer between 1 and half of the sample size. If None (the default), it uses the heuristic value

    \[\left \lfloor \sqrt{n} + 0.5 \right \rfloor\]

    where \(n\) is the sample size. This heuristic was originally proposed in [2] and has become common in the literature.

  • base (float, optional) – The logarithmic base to use, defaults to e (natural logarithm).

  • axis (int, optional) – The axis along which the differential entropy is calculated. Default is 0.

  • method ({'vasicek', 'van es', 'ebrahimi', 'correa', 'auto'}, optional) – The method used to estimate the differential entropy from the sample. Default is 'auto'. See Notes for more information.

Returns

The calculated differential entropy.

Return type

entropy (float)

Notes

This function will converge to the true differential entropy in the limit

\[n \to \infty, \quad m \to \infty, \quad \frac{m}{n} \to 0\]

The optimal choice of window_length for a given sample size depends on the (unknown) distribution. Typically, the smoother the density of the distribution, the larger the optimal value of window_length [1]. The following options are available for the method parameter.

  • 'vasicek' uses the estimator presented in [1]. This is one of the first and most influential estimators of differential entropy.

  • 'van es' uses the bias-corrected estimator presented in [3], which is not only consistent but, under some conditions, asymptotically normal.

  • 'ebrahimi' uses an estimator presented in [4], which was shown in simulation to have smaller bias and mean squared error than the Vasicek estimator.

  • 'correa' uses the estimator presented in [5] based on local linear regression. In a simulation study, it had consistently smaller mean square error than the Vasicek estimator, but it is more expensive to compute.

  • 'auto' selects the method automatically (default). Currently, this selects 'van es' for very small samples (<10), 'ebrahimi' for moderate sample sizes (11-1000), and 'vasicek' for larger samples, but this behavior is subject to change in future versions.

All estimators are implemented as described in [6].
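
As a quick sanity check, the estimate for a standard normal sample should approach the analytic value 0.5 * ln(2 * pi * e) ≈ 1.4189; the sample size, seed, and method choice below are arbitrary.

    import numpy as np
    from optbayesexpt.obe_utils import differential_entropy

    rng = np.random.default_rng(0)
    values = rng.standard_normal(1000)      # n = 1000 samples

    # default window_length is floor(sqrt(n) + 0.5) = 32 for n = 1000
    h = differential_entropy(values, method='ebrahimi')
    print(h)   # close to 0.5 * np.log(2 * np.pi * np.e) ≈ 1.4189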

References

[1] Vasicek, O. (1976). A test for normality based on sample entropy. Journal of the Royal Statistical Society: Series B (Methodological), 38(1), 54-59.

[2] Grzegorzewski, P., & Wirczorkowski, R. (1999). Entropy-based goodness-of-fit test for exponentiality. Communications in Statistics-Theory and Methods, 28(5), 1183-1202.

[3] Van Es, B. (1992). Estimating functionals related to a density by a class of statistics based on spacings. Scandinavian Journal of Statistics, 61-72.

[4] Ebrahimi, N., Pflughoeft, K., & Soofi, E. S. (1994). Two measures of sample entropy. Statistics & Probability Letters, 20(3), 225-234.

[5] Correa, J. C. (1995). A new estimator of entropy. Communications in Statistics-Theory and Methods, 24(10), 2439-2449.

[6] Noughabi, H. A. (2015). Entropy Estimation Using Numerical Methods. Annals of Data Science, 2(2), 231-241. https://link.springer.com/article/10.1007/s40745-015-0045-9

optbayesexpt.obe_utils.trace_sort(settings, measurements)[source]

Combine measurements made at identical setting values

Analyzes input arrays of settings and corresponding measurement values, where setting values may repeat, i.e. more than one measurement was made at some of the settings. The function bins the measurements by setting value and calculates statistics for the measurements in each bin.

Parameters
  • settings (ndarray) – Setting values

  • measurements (ndarray) – Measurement values

Returns

A tuple, (sorted_settings, m_average, m_sigma, n_of_m)
  • sorted_settings (list): setting values (sorted, none repeated)

  • m_average (list): average measurement value at each setting

  • m_sigma (list): standard deviation of measurement values at each setting

  • n_of_m (list): number of measurements at each setting.
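
A short usage sketch (the settings and measurement arrays are made up for illustration, and the unpacking follows the return tuple documented above):

    import numpy as np
    from optbayesexpt.obe_utils import trace_sort

    # repeated measurements at a few setting values
    settings = np.array([1.0, 2.0, 1.0, 3.0, 2.0, 1.0])
    measurements = np.array([0.9, 2.1, 1.1, 3.0, 1.9, 1.0])

    sorted_settings, m_average, m_sigma, n_of_m = trace_sort(settings, measurements)
    # sorted_settings -> [1.0, 2.0, 3.0]
    # n_of_m          -> [3, 2, 1]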