
Phase 2 · Model Modification

Priority: High · Status: Pending · Depends on: Phase 1 (event encoder)

  • Kronos.forward() at kronos.py:239 uses an additive temporal-embedding pattern
  • Temporal embedding added at line 257: x = x + time_embedding
  • Event embedding follows identical pattern — additive to token embedding

Add an EventEmbedding module to the Kronos predictor. Minimal code change: a new embedding layer plus three lines in forward(). The BSQ tokenizer, HierarchicalEmbedding, DualHead, and temporal embeddings remain frozen.

  • Accept optional (B, T, 20) event tensor in forward(), decode_s1(), decode_s2()
  • Project event features to d_model dimensions via linear layer
  • Add event embedding to combined embedding (same as temporal)
  • Maintain backward compatibility — events=None produces identical output to base model
  • Zero overhead when events=None (skip projection entirely)
  • New params: ~5.4K (nn.Linear(20, d_model=256): 20·256 weights + 256 bias = 5,376) — negligible VRAM increase
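The parameter count above is easy to verify (a quick sanity check, assuming d_model=256 as stated):

```python
import torch.nn as nn

# Parameter count of the proposed projection: num_event_channels -> d_model
layer = nn.Linear(20, 256)
n_params = sum(p.numel() for p in layer.parameters())
print(n_params)  # 20*256 weights + 256 bias = 5376
```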
class EventEmbedding(nn.Module):
    def __init__(self, num_event_channels=20, d_model=256):
        super().__init__()
        self.proj = nn.Linear(num_event_channels, d_model)
        nn.init.xavier_uniform_(self.proj.weight)
        nn.init.zeros_(self.proj.bias)

    def forward(self, events):
        # events: (B, T, 20) or None
        if events is None:
            return 0  # additive identity
        return self.proj(events.float())
# Current (lines 254-258):
x = self.embedding([s1_ids, s2_ids])
if stamp is not None:
    time_embedding = self.time_emb(stamp)
    x = x + time_embedding
x = self.token_drop(x)

# Modified:
x = self.embedding([s1_ids, s2_ids])
if stamp is not None:
    time_embedding = self.time_emb(stamp)
    x = x + time_embedding
event_embedding = self.event_emb(events)  # NEW — returns 0 if events is None
x = x + event_embedding                   # NEW
x = self.token_drop(x)
  • Accept optional events parameter (B, T, 10) for context + (B, pred_len, 10) for prediction window
  • Concatenate context + prediction event tensors aligned with full_stamp
  • Pass events to model.decode_s1() and model.decode_s2()
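The context + prediction concatenation can be sketched as follows (a minimal sketch: the tensor names context_events / pred_events / full_events are illustrative, not actual Kronos identifiers, and the 10-channel layout follows the bullets above):

```python
import torch

B, ctx_len, pred_len, C = 2, 64, 16, 10

context_events = torch.randn(B, ctx_len, C)   # aligned with context timestamps
pred_events = torch.randn(B, pred_len, C)     # aligned with prediction timestamps

# Concatenate along the time axis so the event tensor lines up with full_stamp
full_events = torch.cat([context_events, pred_events], dim=1)
print(full_events.shape)  # (B, ctx_len + pred_len, C)
```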
  1. Add EventEmbedding class to kronos-service/kronos_lib/model/module.py
  2. Add self.event_emb = EventEmbedding(20, d_model) to Kronos.__init__()
  3. Modify Kronos.forward() — add event embedding injection
  4. Modify Kronos.decode_s1() — pass events through
  5. Leave Kronos.decode_s2() unchanged — it receives context from decode_s1
  6. Modify auto_regressive_inference() — accept and route events
  7. Modify KronosPredictor.predict() — accept events DataFrame
  8. Test: verify events=None produces identical output to base model (regression check)
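Step 8's regression check follows a simple pattern: identical inputs with events=None must reproduce the base forward pass bit-for-bit. The sketch below uses a toy stand-in for Kronos (the real test would load the actual model); only the structure of the check is the point here:

```python
import torch
import torch.nn as nn

class ToyModel(nn.Module):
    """Toy stand-in for Kronos: additive event embedding, None short-circuits."""
    def __init__(self, d_model=32, num_event_channels=20):
        super().__init__()
        self.backbone = nn.Linear(d_model, d_model)
        self.event_proj = nn.Linear(num_event_channels, d_model)

    def forward(self, x, events=None):
        if events is not None:
            x = x + self.event_proj(events.float())
        return self.backbone(x)

torch.manual_seed(0)
model = ToyModel()
x = torch.randn(2, 8, 32)

# events=None must match the unmodified call path exactly
out_none = model(x, events=None)
out_base = model(x)
torch.testing.assert_close(out_none, out_base, rtol=0, atol=0)

# events=tensor must change the output
out_ev = model(x, events=torch.randn(2, 8, 20))
assert not torch.allclose(out_none, out_ev)
```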
  • Modify: kronos-service/kronos_lib/model/module.py — add EventEmbedding class
  • Modify: kronos-service/kronos_lib/model/kronos.py — modify Kronos class + inference
  • model.forward(s1, s2, stamp, events=None) produces identical output to unmodified Kronos
  • model.forward(s1, s2, stamp, events=tensor) produces different output
  • No VRAM increase when events=None
  • All existing tests pass without modification (backward compatible)
  • New unit test: verify event embedding gradient flows during training
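The gradient-flow unit test in the last bullet could look like this (EventEmbedding re-declared from the snippet above so the test is self-contained):

```python
import torch
import torch.nn as nn

class EventEmbedding(nn.Module):
    def __init__(self, num_event_channels=20, d_model=256):
        super().__init__()
        self.proj = nn.Linear(num_event_channels, d_model)
        nn.init.xavier_uniform_(self.proj.weight)
        nn.init.zeros_(self.proj.bias)

    def forward(self, events):
        if events is None:
            return 0  # additive identity
        return self.proj(events.float())

emb = EventEmbedding()
events = torch.ones(2, 8, 20)  # (B, T, 20)

# Push a dummy loss through the embedding; gradients must reach proj
loss = emb(events).sum()
loss.backward()

assert emb.proj.weight.grad is not None
assert emb.proj.weight.grad.abs().sum() > 0  # nonzero gradient flowed
```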