From 16d5d89f503326e951b88a2eb13b54a78a984fcd Mon Sep 17 00:00:00 2001
From: Peilin Li
Date: Sat, 20 Dec 2025 13:44:35 +0800
Subject: [PATCH] [docs]: Update Python version options in DPO tutorial (#1734)

---
 doc/en/DPO_tutorial.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/doc/en/DPO_tutorial.md b/doc/en/DPO_tutorial.md
index cc21cd1..ad07261 100644
--- a/doc/en/DPO_tutorial.md
+++ b/doc/en/DPO_tutorial.md
@@ -7,7 +7,7 @@ This tutorial demonstrates how to use Direct Preference Optimization (DPO) to fi
 ### Step 1: Create a conda environment and suit it for KTransformers

 ```Bash
-conda create -n Kllama python=3.12 # choose from : [3.10, 3.11, 3.12, 3.13]
+conda create -n Kllama python=3.12 # choose from : [3.11, 3.12, 3.13]
 conda install -y -c conda-forge libstdcxx-ng gcc_impl_linux-64
 conda install -y -c nvidia/label/cuda-12.8.0 cuda-runtime
 ```