Markovian Pre-Trained Transformer for Next-Item Recommendation


Markovian Pre-trained Transformer (MPT) for Next-Item Recommendation: ✅ pre-trained entirely (100%) on synthetic Markov chains, with no real interaction data; ✅ better transferability to downstream recommendation datasets.


📊 Experimental Results

Figure: experimental results.

⚙️ Requirements

🚀 Usage

┌── data                # the 'root' path of data
│   ├── Processed
│   │   ├── Amazon2014Beauty_550_LOU   # the training data
│   │   └── ...
│   ├── Amazon2014Beauty.zip           # the raw data
│   └── ...
│
├── logs                # training logs
│
├── models              # saved pre-trained models, e.g., sentence-t5-xl
│
├── configs
│   ├── finetune.yaml   # config for fine-tuning
│   └── pretrain.yaml   # config for pre-training
│
├── encode.py           # encoding item features
│
├── finetune.py
├── pretrain.py
│
└── sampler.py          # sampling Markov trajectories
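
For concreteness, below is a minimal sketch of what item-feature encoding (encode.py) could look like with the sentence-t5-xl encoder listed under models above. This is an illustration under assumptions, not the repository's actual implementation: the CSV path, the "title" column, and the output file name are all hypothetical.

# A hedged sketch of item-feature encoding; the real encode.py may differ.
import numpy as np
import pandas as pd
from sentence_transformers import SentenceTransformer

def encode_items(csv_path: str, out_path: str) -> None:
    items = pd.read_csv(csv_path)                # hypothetical schema: one row per item
    texts = items["title"].fillna("").tolist()   # the "title" column is an assumption
    model = SentenceTransformer("sentence-transformers/sentence-t5-xl")
    embeddings = model.encode(texts, batch_size=64, show_progress_bar=True)
    np.save(out_path, embeddings)                # one embedding row per item

if __name__ == "__main__":
    encode_items("data/Processed/Amazon2014Beauty_550_LOU/items.csv",
                 "data/Processed/Amazon2014Beauty_550_LOU/item_emb.npy")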

Markovian Pre-Training

python pretrain.py --config configs/pretrain.yaml --alpha 0.05 --num-states 30

Tip

The pre-trained models are stored in the logs/... directory.
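
Under the hood, pre-training consumes synthetic trajectories produced by sampler.py. For intuition only, the sketch below shows one plausible sampling scheme, under our assumption (not necessarily the repository's exact one) that each chain's transition matrix is drawn row-wise from a symmetric Dirichlet(alpha) prior over num-states states; a small alpha such as 0.05 then yields sparse, near-deterministic transitions.

# A hedged sketch of Markov-trajectory sampling; the real sampler.py may differ.
import numpy as np

def sample_trajectory(num_states: int = 30, alpha: float = 0.05,
                      length: int = 50, rng=None) -> np.ndarray:
    if rng is None:
        rng = np.random.default_rng()
    # Row-stochastic transition matrix: one Dirichlet draw per source state.
    P = rng.dirichlet(alpha * np.ones(num_states), size=num_states)
    states = np.empty(length, dtype=np.int64)
    states[0] = rng.integers(num_states)         # uniform initial state
    for t in range(1, length):
        states[t] = rng.choice(num_states, p=P[states[t - 1]])
    return states

print(sample_trajectory())                       # e.g., array([12,  7,  7, 29, ...])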

Recommendation Fine-Tuning

  • Adaptor only:
    python finetune.py --config configs/finetune.yaml --dataset Amazon2014Beauty_550_LOU --path logs/...
  • +LoRA (set --adaptor-only False; a minimal LoRA sketch follows below):
    python finetune.py --config configs/finetune.yaml --adaptor-only False --dataset Amazon2014Beauty_550_LOU --path logs/...
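
The +LoRA variant presumably injects trainable low-rank adapters into the frozen backbone's linear layers in addition to training the adaptor. As a rough illustration, not the repository's implementation, a LoRA-wrapped linear layer looks like this; the rank and scale values are placeholders.

# A minimal LoRA linear layer, for illustration only.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, rank: int = 8, scale: float = 1.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():         # freeze the pre-trained weights
            p.requires_grad = False
        self.A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scale = scale

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # B is zero-initialized, so the wrapped layer starts as a no-op delta.
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

layer = LoRALinear(nn.Linear(64, 64))
print(layer(torch.randn(2, 64)).shape)           # torch.Size([2, 64])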

Note

To reproduce the results reported in the paper, follow the steps in data/README.md and models/README.md to download the processed datasets and pre-trained models.

Acknowledgements

  1. Simon-Lepage/MarkovICL: We sincerely thank Simon Lepage for sharing the code.

Citation

@article{xu2025mpt,
  title={Markovian pre-trained transformer for next-item recommendation},
  author={Xu, Cong and Li, Guoliang and Wang, Jun and Zhang, Wei},
  journal={arXiv preprint arXiv:2601.08275},
  year={2026}
}
