[1] NIE F P, LI Z H, WANG R, et al. An effective and efficient algorithm for K-means clustering with new formulation[J]. IEEE Transactions on Knowledge and Data Engineering, 2023, 35(4): 3433-3443.
[2] BAI L, LIANG J Y, ZHAO Y X. Self-constrained spectral clustering[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023, 45(4): 5126-5138.
[3] ZHANG Q H, DAI Y Y, WANG G Y. Density peaks clustering based on balance density and connectivity[J]. Pattern Recognition, 2023, 134: 109052.
[4] HIRECHE C, DRIAS H, MOULAI H. Grid based clustering for satisfiability solving[J]. Applied Soft Computing, 2020, 88: 106069.
[5] MONATH N, KOBREN A, KRISHNAMURTHY A, et al. Scalable hierarchical clustering with tree grafting[C]∥Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. New York: ACM, 2019: 1438-1448.
[6] WAN Y C, LIU X B, WU Y, et al. ICGT: a novel incremental clustering approach based on GMM tree[J]. Data & Knowledge Engineering, 2018, 117: 71-86.
[7] LU B, FAN X M. Research on 3D point cloud skeleton extraction based on improved adaptive k-means clustering[J]. Acta Automatica Sinica, 2022, 48(8): 1994-2006.
[8] LI Z, TANG C, ZHENG X, et al. Unified K-means coupled self-representation and neighborhood kernel learning for clustering single-cell RNA-sequencing data[J]. Neurocomputing, 2022, 501: 715-726.
[9] IM S, QAEM M M, MOSELEY B, et al. Fast noise removal for K-means clustering[C]∥The 23rd International Conference on Artificial Intelligence and Statistics (AISTATS). Palermo: AISTATS, 2020: 456-466.
[10] LI Y M, ZHANG Y, TANG Q T, et al. T-k-means: a robust and stable k-means variant[C]∥International Conference on Acoustics, Speech and Signal Processing (ICASSP). Piscataway: IEEE, 2021: 3120-3124.
[11] GRUNAU C, ROZHON V. Adapting K-means algorithms for outliers[C]∥The 39th International Conference on Machine Learning. Baltimore: ICML, 2022: 7845-7886.
[12] ZHANG Z, FENG Q L, HUANG J Y, et al. A local search algorithm for k-means with outliers[J]. Neurocomputing, 2021, 450: 230-241.
[13] HUANG S D, REN Y Z, XU Z L. Robust multi-view data clustering with multi-view capped-norm K-means[J]. Neurocomputing, 2018, 311: 197-208.
[14] HUANG S D, KANG Z, XU Z L, et al. Robust deep k-means: an effective and simple method for data clustering[J]. Pattern Recognition, 2021, 117: 107996.
[15] HAUTAMÄKI V, CHEREDNICHENKO S, KÄRKKÄINEN I, et al. Improving k-means by outlier removal[C]∥Proceedings of the 14th Scandinavian Conference on Image Analysis. New York: ACM, 2005: 978-987.
[16] PENG D W, CHEN Z Z, FU J C, et al. Fast k-means clustering based on the neighbor information[C]∥ISEEIE 2021: 2021 International Symposium on Electrical, Electronics and Information Engineering. New York: ACM, 2021: 551-555.
[17] GIFFON L, EMIYA V, KADRI H, et al. Quick-means: accelerating inference for K-means by learning fast transforms[J]. Machine Learning, 2021, 110(5): 881-905.
[18] HAMERLY G. Making k-means even faster[C]∥Proceedings of the 10th SIAM International Conference on Data Mining. Philadelphia: SIAM, 2010: 130-140.
[19] DRAKE J. Faster K-means clustering[EB/OL]. (2013-09-24)[2023-06-13]. http://hdl.handle.net/2104/8826.
[20] NEWLING J, FLEURET F. Fast K-means with accurate bounds[C]∥Proceedings of the 33rd International Conference on Machine Learning - Volume 48. New York: ACM, 2016: 936-944.
[21] XIA S Y, PENG D W, MENG D Y, et al. Ball k-means: fast adaptive clustering with no bounds[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022, 44(1): 87-99.
[22] ARTHUR D, VASSILVITSKII S. K-means++: the advantages of careful seeding[C]∥Proceedings of the Eighteenth Annual ACM-SIAM Symposium on Discrete Algorithms. New York: ACM, 2007: 1027-1035.
[23] KAUFMAN L, ROUSSEEUW P. Clustering by means of medoids[EB/OL]. (1987-01-01)[2023-06-13]. https://www.researchgate.net/publication/243777819_Clustering_by_Means_of_Medoids.