The entropy_kde2d() function estimates the Shannon entropy of a two-dimensional dataset using kernel density estimation (KDE). It provides a non-parametric measure of entropy, useful for analyzing the uncertainty or randomness in bivariate distributions, such as spatial data from animal trajectories (e.g., Maei et al., 2009).
The parameters for entropy_kde2d() are as follows:

x: A numeric vector representing the random variable in the first dimension.

y: A numeric vector representing the random variable in the second dimension.

n_grid: A numeric value specifying the number of grid cells used to evaluate the density; defaults to 150. Higher values result in finer density grids but increase computational cost.
The function outputs a single numeric value representing the entropy of the given data.
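To make the idea concrete, the sketch below shows one way a KDE-based 2D entropy estimate can be computed with MASS::kde2d. The helper name kde2d_entropy_sketch is hypothetical and this is only an illustration of the general approach, not necessarily the internal implementation of entropy_kde2d().

# Sketch of a KDE-based 2D entropy estimate using MASS::kde2d.
# Illustrative only; entropy_kde2d() may differ in its internals.
library(MASS)
kde2d_entropy_sketch <- function(x, y, n_grid = 150) {
  dens <- kde2d(x, y, n = n_grid)   # density evaluated on an n_grid x n_grid grid
  dx <- diff(dens$x[1:2])           # grid spacing along x
  dy <- diff(dens$y[1:2])           # grid spacing along y
  p <- dens$z[dens$z > 0]           # drop zero cells to avoid log(0)
  -sum(p * log(p)) * dx * dy        # numerical approximation of -integral of f * log(f)
}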
set.seed(123)
# Generate a 2D normal distribution with a correlation of 0.6
n <- 1000
mean <- c(0, 0)
sd_x <- 1
sd_y <- 5
correlation <- 0.6
sigma <- matrix(
  c(
    sd_x^2,
    correlation * sd_x * sd_y,
    correlation * sd_x * sd_y,
    sd_y^2
  ),
  ncol = 2
)
library(MASS)
simulated_data <- mvrnorm(n, mu = mean, Sigma = sigma)
x <- simulated_data[, 1]
y <- simulated_data[, 2]
# Plot the data
plot(simulated_data)
# Compute the inputs for the closed-form normal entropy
cov_matr <- cov(cbind(x, y))
sigmas <- diag(cov_matr) # marginal variances of x and y
det_sig <- prod(sigmas) # product of the marginal variances (off-diagonal covariance is ignored)
normal_entropy is a function that computes the entropy of a bivariate normal distribution given the number of dimensions k, the value of \(\pi\), and the determinant of the covariance matrix det_sig. This is used, for example, in Maei et al. (2009) to compute the entropy of mice trajectories in a Morris water maze.
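For reference, this is the closed-form differential entropy of a k-dimensional Gaussian with covariance matrix \(\Sigma\), \[ H = \frac{k}{2}\bigl(1 + \log(2\pi)\bigr) + \frac{1}{2}\log\det\Sigma, \] which the function below implements directly.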
normal_entropy <- function(k, pi, det_sig) {
  (k / 2) * (1 + log(2 * pi)) + (1 / 2) * log(det_sig)
}
entropia <- normal_entropy(k = 2, pi = pi, det_sig = det_sig)
print(entropia) # Expected value close to 4.3997
## [1] 4.399979
# Compute entropy using entropy_kde2d
result <- entropy_kde2d(x, y, n_grid = 50)
print(result) # Expected value close to 4.2177
## [1] 4.217723
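The small gap between the closed-form value and the KDE estimate is expected here: det_sig was computed as the product of the marginal variances, so it ignores the covariance between x and y, whereas the KDE estimate reflects the correlated joint density. A quick check, reusing the objects defined above, plugs the full determinant of the sample covariance matrix into the same formula:

# Closed-form entropy using the full determinant (covariance term included)
normal_entropy(k = 2, pi = pi, det_sig = det(cov_matr))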