[Arxiv] Code repo for the paper entitled "Task Vector Bases: A Unified and Scalable Framework for Compressed Task Arithmetic"

Task Vector Basis

[TODO] New codebase coming soon.

This is the official repository that implements the bases algorithms in Task Vector Bases: A Unified and Scalable Framework for Compressed Task Arithmetic.

Introduction

Task arithmetic, which represents downstream tasks through linear operations on task vectors, has emerged as a simple yet powerful paradigm for transferring knowledge across diverse settings. However, maintaining a large collection of task vectors introduces scalability challenges in both storage and computation. We propose Task Vector Bases, a framework that compresses $T$ task vectors into $M < T$ basis vectors while preserving the functionality of task arithmetic. By representing each task vector as a structured linear combination of basis atoms, our approach supports standard operations such as addition and negation, as well as more advanced arithmetic operations. The framework is orthogonal to other efficiency-oriented improvements in task arithmetic and can be combined with them. We provide theoretical analysis showing that basis compression retains the generalization guarantees of addition and enables principled unlearning, with error bounds depending on reconstruction quality. Empirically, our proposed basis construction methods consistently outperform heuristic basis construction baselines and, in some cases, even surpass the performance of the full task vector collection across diverse downstream applications, all while reducing storage and computational requirements.
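To make the compression idea above concrete, here is a minimal NumPy sketch: $T$ task vectors are stacked into a matrix, a rank-$M$ basis is extracted (here via a plain truncated SVD, used purely for illustration; it is not the paper's basis-construction algorithm), and each task vector is stored only as its $M$ coefficients. All names and shapes below are illustrative assumptions.

```python
import numpy as np

# Toy setup: T task vectors of dimension d, compressed to M < T basis atoms.
# SVD is used here only as an illustrative compressor, not as the paper's
# actual basis-construction method.
rng = np.random.default_rng(0)
T, d, M = 8, 100, 3

# Simulate T task vectors that approximately share low-dimensional structure.
true_basis = rng.standard_normal((d, M))
task_vectors = true_basis @ rng.standard_normal((M, T)) \
    + 0.01 * rng.standard_normal((d, T))

# Compress: keep the top-M left singular vectors as the basis.
U, S, Vt = np.linalg.svd(task_vectors, full_matrices=False)
basis = U[:, :M]                 # d x M basis atoms (stored)
coeffs = basis.T @ task_vectors  # M x T coefficients (stored, cheap)

# Task arithmetic now operates on coefficients; e.g. averaging all tasks:
merged = basis @ coeffs.mean(axis=1)   # reconstructed multi-task vector

# Reconstruction error should be small relative to the signal.
recon = basis @ coeffs
rel_err = np.linalg.norm(task_vectors - recon) / np.linalg.norm(task_vectors)
print(f"relative reconstruction error: {rel_err:.4f}")
```

Storage drops from $T \times d$ numbers to $M \times d + M \times T$, which is the source of the scalability gains described above.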

Installation

To run the code, please install all its dependencies and activate the environment:

git clone https://github.com/uiuctml/TaskVectorBasis.git
cd TaskVectorBasis
conda env create -n task-vector-basis --file environment.yml
conda activate task-vector-basis

Datasets

To download and prepare the vision datasets, please follow the instructions in this issue. For 12-task language datasets, please follow LM-BFF-main to prepare the data.

Vision Experiments

Please follow the instructions in MERS_exp.ipynb to prepare the inner merging results. Then, if the outer merging method is task-arithmetic, run bash cmd/run_outerTA.sh.

Medium-scale Natural Language Experiments

Once fine-tuning is finished (see Localize-and-Stitch), run L&S/language/clustering.ipynb to inspect the clustering results. The two bases-L&S methods can then be run with:

  • SharedMask: python main_localize_stitch_SharedMask.py
  • StitchTwice: python main_localize_stitch_StitchTwice.py

To test these two methods on new tasks, please edit the highlighted TODO lines to reflect the updated clustering results.

Large-scale Natural Language Experiments

To fine-tune on a single dataset:

python src/nlp_mlm.py --dataset <YOUR TASK NAME> --main train --seed 0

To run basis addition, first compute the clustering result:

python src/nlp_merge.py --seed 0

then filter out bad tasks and run the final merge:

python src/nlp_mlm.py --main eval_majority --seed 0
python src/nlp_mlm.py --main basis_merging --merge_method weight_averaging --seed 0
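The weight_averaging merge method named above can be sketched as follows, assuming each checkpoint is a dict of parameter arrays averaged element-wise. The function and checkpoint names here are illustrative, not the repo's actual API.

```python
import numpy as np

def weight_average(state_dicts):
    """Element-wise mean of a list of checkpoints (dicts of arrays).

    Illustrative sketch of a weight-averaging merge; the repo's own
    implementation may differ in parameter handling and filtering.
    """
    keys = state_dicts[0].keys()
    return {k: np.mean([sd[k] for sd in state_dicts], axis=0) for k in keys}

# Three toy checkpoints with identical parameter names.
ckpts = [{"w": np.full(3, float(i)), "b": np.array([float(i)])}
         for i in range(3)]
merged = weight_average(ckpts)
print(merged["w"])  # element-wise mean of the three "w" arrays
```

In the actual pipeline, the checkpoints being averaged are the ones surviving the bad-task filtering step above.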

Acknowledgement

Our repository is built upon tangent_task_arithmetic and Localize-and-Stitch.

Reference

If you find this code useful, please cite the following paper:

@article{zeng2025efficient,
  title={Efficient Model Editing with Task Vector Bases: A Theoretical Framework and Scalable Approach},
  author={Zeng, Siqi and He, Yifei and You, Weiqiu and Hao, Yifan and Tsai, Yao-Hung Hubert and Yamada, Makoto and Zhao, Han},
  journal={arXiv preprint arXiv:2502.01015},
  year={2025}
}

Contact

Feel free to open an issue or contact siqi6@illinois.edu with any questions or comments.
