Compute global attention weights and context vectors for time series

computeattention(
  series,
  attention_type = "cosine",
  window_size = 3,
  decay_factor = 5,
  temperature = 1,
  sigma = 1,
  sensitivity = 1,
  alpha = 0.5,
  beta = 0.5
)

Arguments

series

Numeric vector of length n containing the time series.

attention_type

String specifying the type of attention mechanism to use. Options are: "cosine", "exponential", "dot_product", "scaled_dot_product", "gaussian", "linear", "value_based", "hybrid", "parametric". Default is "cosine".

window_size

Integer specifying the local window size (used when attention_type = "cosine").

decay_factor

Double specifying the decay rate (used when attention_type = "exponential").

temperature

Double specifying the temperature (used when attention_type = "scaled_dot_product").

sigma

Double specifying the bandwidth sigma (used when attention_type = "gaussian").

temperature

sensitivity

Double specifying the sensitivity (used when attention_type = "value_based" or "hybrid").

alpha

Double specifying the weight alpha (used when attention_type = "parametric").

beta

Double specifying the weight beta (used when attention_type = "parametric").
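
The distance-based parameters above (decay_factor, sigma) suggest weights that fall off with temporal distance and are renormalized over the causal window. As an illustration only — the formulas below are plausible assumptions, not the package's documented definitions — "exponential" attention might score a past point j from time i as exp(-decay_factor * (i - j)), and "gaussian" as exp(-(i - j)^2 / (2 * sigma^2)):

```r
# Sketch of causal attention weights for the "exponential" and "gaussian"
# mechanisms. The score formulas are assumptions for illustration; the
# package's exact definitions may differ.
causal_weights <- function(n, type = c("exponential", "gaussian"),
                           decay_factor = 5, sigma = 1) {
  type <- match.arg(type)
  W <- matrix(0, n, n)
  for (i in seq_len(n)) {
    lag <- i - seq_len(i)                    # distances to past points j <= i
    score <- switch(type,
      exponential = exp(-decay_factor * lag),
      gaussian    = exp(-lag^2 / (2 * sigma^2)))
    W[i, seq_len(i)] <- score / sum(score)   # normalize each row over j <= i
  }
  W
}

W <- causal_weights(5, "exponential", decay_factor = 1)
rowSums(W)  # each row sums to 1; entries above the diagonal stay 0
```

Whatever the exact scoring function, normalizing each row over j <= i yields the lower-triangular (causal) weight matrix described under Value below.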

Value

List containing:

attention_weights

n × n matrix where entry (i,j) represents the attention weight of time j on time i. Only entries j <= i are non-zero (causal attention).

context_vectors

Vector of length n where each entry i is the weighted sum of all values up to time i, using the attention weights.
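
The relationship between the two return components can be sketched directly from the description above: each context vector is the matrix product of the causal attention matrix with the series (a minimal illustration, not the package's internal code):

```r
# Context vector i = weighted sum of series values up to time i,
# i.e. the causal attention matrix times the series.
context_from_weights <- function(W, series) {
  as.numeric(W %*% series)
}

# With uniform causal weights, context[i] is the running mean of the series:
n <- 5
W <- matrix(0, n, n)
for (i in seq_len(n)) W[i, seq_len(i)] <- 1 / i
context_from_weights(W, c(1, 2, 3, 4, 5))
#> [1] 1.0 1.5 2.0 2.5 3.0
```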

Examples

# For a series of length 5 using "cosine" attention
series <- c(1, 2, 3, 4, 5)
result <- computeattention(series, attention_type = "cosine", window_size = 3)

# attention_weights is a 5 x 5 matrix; context_vectors has length 5
dim(result$attention_weights)
#> [1] 5 5
length(result$context_vectors)
#> [1] 5