Use torchtune 0.6.1 #10792

Merged: 3 commits, May 9, 2025
4 changes: 0 additions & 4 deletions .ci/scripts/test_model.sh
@@ -87,10 +87,6 @@ test_model() {
 bash examples/models/llava/install_requirements.sh
 STRICT="--no-strict"
 fi
-if [[ "$MODEL_NAME" == "llama3_2_vision_encoder" || "$MODEL_NAME" == "llama3_2_text_decoder" ]]; then
-# Install requirements for llama vision.
-bash examples/models/llama3_2_vision/install_requirements.sh
-fi
 if [[ "${MODEL_NAME}" == "qwen2_5" ]]; then
 # Install requirements for export_llama
 bash examples/models/llama/install_requirements.sh
3 changes: 0 additions & 3 deletions .ci/scripts/unittest-linux.sh
@@ -24,9 +24,6 @@ if [[ "$BUILD_TOOL" == "cmake" ]]; then
 CMAKE_ARGS="-DEXECUTORCH_BUILD_PYBIND=ON -DEXECUTORCH_BUILD_XNNPACK=ON -DEXECUTORCH_BUILD_KERNELS_QUANTIZED=ON" \
 .ci/scripts/setup-linux.sh "$@"

-# Install llama3_2_vision dependencies.
-PYTHON_EXECUTABLE=python ./examples/models/llama3_2_vision/install_requirements.sh
-
 .ci/scripts/unittest-linux-cmake.sh
 elif [[ "$BUILD_TOOL" == "buck2" ]]; then
 # Removing this breaks sccache in the Buck build, apparently
1 change: 0 additions & 1 deletion .ci/scripts/unittest-macos.sh
@@ -29,7 +29,6 @@ if [[ "$BUILD_TOOL" == "cmake" ]]; then
 # Install llama3_2_vision dependencies.
 PYTHON_EXECUTABLE=python \
 ${CONDA_RUN} --no-capture-output \
-./examples/models/llama3_2_vision/install_requirements.sh

 .ci/scripts/unittest-macos-cmake.sh
 elif [[ "$BUILD_TOOL" == "buck2" ]]; then
4 changes: 0 additions & 4 deletions backends/arm/test/test_arm_baremetal.sh
@@ -83,8 +83,6 @@ test_pytest_ops() { # Test ops and other things
 test_pytest_models() { # Test ops and other things
 echo "${TEST_SUITE_NAME}: Run pytest"

-examples/models/llama3_2_vision/install_requirements.sh
-
 # Prepare for pytest
 backends/arm/scripts/build_executorch.sh

@@ -117,8 +115,6 @@ test_pytest_ops_ethosu_fvp() { # Same as test_pytest but also sometime verify us
 test_pytest_models_ethosu_fvp() { # Same as test_pytest but also sometime verify using Corstone FVP
 echo "${TEST_SUITE_NAME}: Run pytest with fvp"

-examples/models/llama3_2_vision/install_requirements.sh
-
 # Prepare Corstone-3x0 FVP for pytest
 backends/arm/scripts/build_executorch.sh
 backends/arm/scripts/build_portable_kernels.sh
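
With the model-specific install step gone, these suites rely on the environment prepared earlier in CI plus the shared example requirements. A minimal sketch of running one suite locally, assuming the script dispatches on a suite-name argument (as the ${TEST_SUITE_NAME} echo lines suggest); the exact entry point may differ:

    # Hypothetical invocation: run the model test suite by name.
    backends/arm/test/test_arm_baremetal.sh test_pytest_models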
13 changes: 0 additions & 13 deletions examples/models/llama3_2_vision/install_requirements.sh

This file was deleted.
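
With the standalone installer deleted, the example's torchtune dependency is expected to come from the shared requirements file updated below. A minimal sketch of the replacement flow, assuming the requirements file is installed from the repository root (the actual CI wiring may differ):

    # The llama3_2_vision example no longer has its own install script;
    # torchtune is pulled in via the shared examples requirements instead.
    pip install -r requirements-examples.txt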

1 change: 1 addition & 0 deletions requirements-examples.txt
@@ -2,4 +2,5 @@
 # TODO: Make each example publish its own requirements.txt
 timm == 1.0.7
 torchsr == 1.0.4
+torchtune >= 0.6.1
 transformers ==4.47.1
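
A quick, hedged check that a local environment satisfies the new floor (assumes a standard Python >= 3.8 interpreter with the requirements above installed):

    # Print the installed torchtune version; it should be 0.6.1 or newer.
    python -c "from importlib.metadata import version; print(version('torchtune'))"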