Bayesian Knowledge Tracing

Bayesian Knowledge Tracing (BKT) is a probabilistic model of a learner's latent mastery of a skill. Its core parameters, and the factors it can be extended with, are:

  1. pL – probability of latent knowledge of the skill (mastery)

  2. pT – probability of transitioning from unmastered to mastered (learning)

  3. pG – probability of guessing correctly without mastery

  4. pS – probability of slipping (answering incorrectly despite mastery)

  5. Learning order (K)

  6. Problem difficulty

  7. Prior knowledge (initial assessment + sequential)

  8. *Learning rate/speed (derivatives / ODE or PDE)

Equation (a): $p(L_1)_u^k = p(L_0)^k$
#@title Initialize Parameters

import numpy as np

def initialize_parameters(seed=1):
  """Randomly initialize the four BKT parameters."""
  np.random.seed(seed)

  # Each parameter is a probability, so draw from Uniform(0, 1) rather than
  # a standard normal, which can produce values outside [0, 1].
  pL = np.random.rand()  # initial probability of mastery, p(L0)
  pT = np.random.rand()  # probability of learning at each opportunity, p(T)
  pS = np.random.rand()  # probability of a slip, p(S)
  pG = np.random.rand()  # probability of a lucky guess, p(G)

  parameters = {
      'pL': pL,
      'pT': pT,
      'pS': pS,
      'pG': pG
  }

  return parameters
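
As a quick sanity check (assuming the `initialize_parameters` function defined above), the initializer can be called directly and each value verified to be a probability:

# Illustrative usage of initialize_parameters; each value should lie in [0, 1].
params = initialize_parameters()
for name, value in params.items():
  print(f"{name} = {value:.3f}")
  assert 0.0 <= value <= 1.0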
Equation (b): $p(L_t \mid \text{obs} = \text{correct})_u^k = \dfrac{p(L_t)_u^k \cdot (1 - p(S)^k)}{p(L_t)_u^k \cdot (1 - p(S)^k) + (1 - p(L_t)_u^k) \cdot p(G)^k}$

Equation (c): $p(L_t \mid \text{obs} = \text{wrong})_u^k = \dfrac{p(L_t)_u^k \cdot p(S)^k}{p(L_t)_u^k \cdot p(S)^k + (1 - p(L_t)_u^k) \cdot (1 - p(G)^k)}$

Equation (d): $p(L_{t+1})_u^k = p(L_t \mid \text{obs})_u^k + (1 - p(L_t \mid \text{obs})_u^k) \cdot p(T)^k$

Equation (e): $p(C_{t+1})_u^k = p(L_{t+1})_u^k \cdot (1 - p(S)^k) + (1 - p(L_{t+1})_u^k) \cdot p(G)^k$
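
Equations (b)-(e) can be wired together into a single update step. The sketch below is illustrative rather than a reference implementation: it assumes the `parameters` dictionary returned by `initialize_parameters` above, and the function names `update_mastery` and `predict_correct` are made up for this example.

#@title BKT Update Step (sketch)

def update_mastery(parameters, obs_correct):
  """One BKT step: posterior mastery given the observation (b)/(c), then learning (d)."""
  pL, pT, pS, pG = parameters['pL'], parameters['pT'], parameters['pS'], parameters['pG']

  if obs_correct:
    # Equation (b): p(L_t | obs = correct)
    posterior = pL * (1 - pS) / (pL * (1 - pS) + (1 - pL) * pG)
  else:
    # Equation (c): p(L_t | obs = wrong)
    posterior = pL * pS / (pL * pS + (1 - pL) * (1 - pG))

  # Equation (d): p(L_{t+1}) = p(L_t | obs) + (1 - p(L_t | obs)) * p(T)
  parameters['pL'] = posterior + (1 - posterior) * pT
  return parameters

def predict_correct(parameters):
  """Equation (e): probability that the next response is correct."""
  pL, pS, pG = parameters['pL'], parameters['pS'], parameters['pG']
  return pL * (1 - pS) + (1 - pL) * pG

For example, tracing one student's sequence of responses (True = correct) could look like:

params = initialize_parameters()
for obs in [True, False, True, True]:
  params = update_mastery(params, obs)
  print(f"p(L) = {params['pL']:.3f}, p(next correct) = {predict_correct(params):.3f}")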
