Abstract: In high-dimensional data processing, the K-nearest neighbor (KNN) graph is a critical data structure widely used in tasks such as clustering, graph neural networks, and recommendation systems. With the increasing use of pretrained embedding models for unstructured data modeling and retrieval, fine-tuning has become a key step in enhancing the semantic representation capability of embeddings. However, such fine-tuning systematically changes the vector representations of all data points, invalidating the neighborhood relationships in the original KNN graph. Existing research focuses primarily on building KNN graphs for static data and lacks efficient solutions for adapting to the updated embeddings produced by fine-tuning. To address this gap, we propose FastAdjust, an efficient KNN graph update method tailored to embedding model fine-tuning scenarios. Leveraging the observation that fine-tuning usually changes individual embeddings only slightly, FastAdjust incrementally adjusts the original KNN graph through a local update strategy, significantly improving update efficiency while maintaining graph quality. Specifically, FastAdjust first employs a clustering structure based on product quantization to efficiently and accurately locate a subset of candidate neighbors for each data point, narrowing the search space. Second, exploiting the correlation of data density and the magnitude of embedding variation with changes in the KNNs, FastAdjust adaptively allocates update resources according to how much each point's neighbor relationships have changed, further improving overall update efficiency. Experiments on real-world datasets demonstrate that FastAdjust adapts KNN graphs to embedding updates efficiently and accurately at significantly reduced computational cost, showing strong practical value and scalability.
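To make the candidate-location idea concrete, the following is a minimal sketch of product-quantization-style bucketing used to narrow the neighbor search space. This is an illustration under assumed design choices, not FastAdjust's actual implementation: all function names, the per-subspace k-means training, and the "share at least one sub-code" candidate rule are hypothetical simplifications.

```python
import numpy as np

def train_subcodebooks(X, m=2, k=4, iters=10, seed=0):
    """Train one small k-means codebook per subspace (naive Lloyd's iterations).
    Hypothetical helper; real PQ training uses optimized k-means."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    ds = d // m  # dimensionality of each subspace
    books = []
    for j in range(m):
        sub = X[:, j * ds:(j + 1) * ds]
        C = sub[rng.choice(n, k, replace=False)].copy()  # random init
        for _ in range(iters):
            dist = ((sub[:, None, :] - C[None]) ** 2).sum(-1)  # (n, k)
            assign = dist.argmin(1)
            for c in range(k):
                pts = sub[assign == c]
                if len(pts):
                    C[c] = pts.mean(0)  # move centroid to cluster mean
        books.append(C)
    return books

def encode(X, books):
    """Assign each point to its nearest sub-centroid in every subspace."""
    m = len(books)
    ds = X.shape[1] // m
    codes = np.empty((X.shape[0], m), dtype=np.int64)
    for j, C in enumerate(books):
        sub = X[:, j * ds:(j + 1) * ds]
        codes[:, j] = ((sub[:, None, :] - C[None]) ** 2).sum(-1).argmin(1)
    return codes

def candidate_neighbors(q_idx, codes):
    """Candidate set: points sharing at least one sub-code with the query,
    so exact distances are only computed on this reduced subset."""
    shared = (codes == codes[q_idx]).any(axis=1)
    shared[q_idx] = False  # a point is not its own neighbor
    return np.flatnonzero(shared)

def refine_knn(X, q_idx, cands, k=3):
    """Exact re-ranking of the candidates by squared Euclidean distance."""
    d = ((X[cands] - X[q_idx]) ** 2).sum(-1)
    return cands[np.argsort(d)[:k]]
```

In a fine-tuning scenario, one would re-encode only the points whose embeddings moved and re-run the candidate lookup and refinement for them, rather than rebuilding the whole graph; the adaptive budget allocation by density and drift magnitude described in the abstract is orthogonal to this sketch.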