circuit_knitting.utils.metrics.cross_entropy

cross_entropy(target, obs)

Compute the cross entropy between two distributions.

The cross entropy is a measure of the difference between two probability distributions, defined as $-\sum_i x_i \log y_i$, where $x$ is the target distribution and $y$ is the observed distribution.

Deprecated since version 0.7.0: The function circuit_knitting.utils.metrics.cross_entropy() is deprecated as of circuit-knitting-toolbox 0.7.0. It will be removed no sooner than CKT v0.8.0.

Example:

>>> cross_entropy(np.array([0.1, 0.1, 0.3, 0.5]), np.array([0.25, 0.25, 0.25, 0.25]))
1.3862943611198906
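The computation behind the example can be sketched directly from the formula above. The function below is a hypothetical re-implementation for illustration only, not the circuit-knitting-toolbox source; it assumes both inputs are array-like probability distributions of equal length.

```python
import numpy as np

def cross_entropy_sketch(target, obs):
    """Cross entropy -sum_i x_i * log(y_i) between two distributions.

    Illustrative sketch; not the library's implementation.
    """
    target = np.asarray(target, dtype=float)
    obs = np.asarray(obs, dtype=float)
    # Terms with zero target probability contribute nothing to the sum.
    mask = target > 0
    return float(-np.sum(target[mask] * np.log(obs[mask])))

# Reproduces the documented example:
print(cross_entropy_sketch([0.1, 0.1, 0.3, 0.5], [0.25, 0.25, 0.25, 0.25]))
# → 1.3862943611198906
```

Since every observed probability is 0.25, the sum reduces to $-\log(0.25) = \log 4 \approx 1.3863$.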

Parameters:
  • target – The target feature vector

  • obs – The observed feature vector

Returns:

The computed cross entropy

Raises:
  • Exception – The target is not a dict

  • Exception – The target and obs are not numpy arrays

  • Exception – The target is not a numpy array and the obs is not a dict