Accurate prediction of the state of health (SOH) of lithium-ion batteries in electric vehicles is crucial for ensuring the safety of drivers and passengers, optimizing battery management systems (BMS), and extending battery life. Reliable SOH prediction underpins core BMS functions, including charge and discharge control, remaining useful life (RUL) prediction, and fault diagnosis. However, existing data-driven methods still face two long-standing challenges. First, most models rely on a large number of handcrafted health features derived from voltage, current, and capacity data, resulting in feature redundancy and low computational efficiency. Second, SOH-related time series exhibit strong nonlinearity and non-stationarity, causing traditional neural networks to suffer from prediction drift, instability, and performance degradation under varying operating conditions. To address the first challenge, we propose a lightweight health-feature selection mechanism that combines incremental capacity analysis (ICA) with correlation analysis to automatically identify compact and physically meaningful degradation features. Only four key health features that are highly correlated with capacity fade are retained, which reduces model complexity and computational cost while maintaining high SOH prediction accuracy. To tackle the second challenge, we develop a hybrid neural network, KanFormer, which integrates Kolmogorov-Arnold Networks (KAN), grounded in the Kolmogorov-Arnold representation theorem, with a Transformer-based temporal modeling framework to enable accurate and robust SOH prediction. The proposed KanFormer framework consists of three hierarchical modules: 1) a local feature extraction module, which leverages the smooth interpolation capability of the KAN to capture fine-grained degradation characteristics from voltage-capacity and incremental capacity (IC) curves, modeling local nonlinear behaviors in the degradation process; 2) a global feature extraction module, which employs a multi-head Transformer encoder to learn long-range dependencies and cross-scale temporal relationships, enabling joint modeling of short-term dynamics and long-term aging evolution; and 3) a prediction output module, which uses nonlinear KAN layers to adaptively fuse the local and global representations, producing numerically stable and highly accurate SOH predictions. By combining the functional expressiveness of the KAN with the temporal reasoning capability of the Transformer, KanFormer effectively mitigates the prediction drift and oscillations induced by data nonlinearity and non-stationarity. Compared with traditional deep-learning models, the proposed method improves training efficiency by 15.32%. Experimental validation on three publicly available battery-aging datasets (Michigan Formation, HNEI, and NASA) demonstrates its superior performance, achieving MSE = 0.0045, MAE = 0.041, and
R2 = 0.978 on the Michigan dataset, MSE = 0.00055, MAE = 0.0175,
R2 = 0.996 on the HNEI dataset, and MSE = 0.0056, MAE = 0.017,
R2 = 0.984 on the NASA dataset. These results substantially outperform mainstream baselines, confirming the high accuracy and robustness of KanFormer. In summary, KanFormer combines lightweight feature selection, nonlinear functional representation, and cross-scale temporal modeling, providing a scalable and interpretable solution for high-accuracy, high-efficiency SOH prediction. Minimal illustrative sketches of the feature-selection step and the KanFormer structure follow.
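To make the feature-selection step concrete, the sketch below combines ICA-derived candidate features with correlation-based ranking, as described above. The specific candidate features (IC peak height, IC peak voltage, IC curve area, voltage at half capacity) and all parameter values are illustrative assumptions rather than the paper's exact feature set.

```python
import numpy as np

def incremental_capacity(voltage, capacity, num_points=200):
    """Smoothed IC (dQ/dV) curve from one constant-current charge segment."""
    # Resample capacity onto a uniform voltage grid so dQ/dV is well defined.
    v_grid = np.linspace(voltage.min(), voltage.max(), num_points)
    q_grid = np.interp(v_grid, voltage, capacity)
    return v_grid, np.gradient(q_grid, v_grid)

def candidate_features(voltage, capacity):
    """Candidate health features for one cycle (illustrative set, not the paper's)."""
    v_grid, ic = incremental_capacity(voltage, capacity)
    peak = int(np.argmax(ic))
    dv = v_grid[1] - v_grid[0]
    return {
        "ic_peak_height": ic[peak],        # height of the main IC peak
        "ic_peak_voltage": v_grid[peak],   # voltage position of that peak
        "ic_area": ic.sum() * dv,          # approximate area under the IC curve
        "v_at_half_capacity": np.interp(capacity[-1] / 2, capacity, voltage),
    }

def select_features(feature_table, soh, top_k=4):
    """Rank candidate features by |Pearson r| against SOH and keep the top_k.

    feature_table: {name: 1-D array over cycles}; soh: 1-D array over cycles.
    """
    scores = {name: abs(np.corrcoef(vals, soh)[0, 1])
              for name, vals in feature_table.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_k]
```

In practice, candidate_features would be applied to every charge cycle, the per-cycle values stacked into a feature table, and select_features used to keep the four features most correlated with capacity fade.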
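The three-module arrangement can likewise be sketched in PyTorch. The KAN layer below uses Gaussian basis functions on a fixed grid as a simplified stand-in for the learnable spline activations of a full KAN, and every dimension and hyperparameter is an assumption for illustration; only the local-KAN, Transformer-encoder, KAN-head ordering follows the description above.

```python
import torch
import torch.nn as nn

class SimpleKANLayer(nn.Module):
    """Minimal KAN-style layer: each input-output edge applies a learnable
    univariate function, parameterized here by Gaussian bases on a fixed grid
    (a simplification of the B-spline parameterization of a full KAN)."""
    def __init__(self, in_dim, out_dim, num_basis=8, grid_range=(-2.0, 2.0)):
        super().__init__()
        self.register_buffer("grid", torch.linspace(*grid_range, num_basis))
        self.coef = nn.Parameter(torch.randn(out_dim, in_dim, num_basis) * 0.1)
        self.base = nn.Linear(in_dim, out_dim)  # residual linear path

    def forward(self, x):
        # x: (..., in_dim) -> basis: (..., in_dim, num_basis)
        basis = torch.exp(-((x.unsqueeze(-1) - self.grid) ** 2))
        spline = torch.einsum("...ib,oib->...o", basis, self.coef)
        return self.base(x) + spline

class KanFormer(nn.Module):
    """Sketch of the three hierarchical modules: KAN-based local feature
    extraction, a Transformer encoder for long-range dependencies, and a
    KAN prediction head. Hyperparameters are illustrative assumptions."""
    def __init__(self, num_features=4, d_model=32, nhead=4, num_layers=2):
        super().__init__()
        self.local = SimpleKANLayer(num_features, d_model)   # module 1
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, dim_feedforward=64, batch_first=True)
        self.global_encoder = nn.TransformerEncoder(encoder_layer, num_layers)  # module 2
        self.head = SimpleKANLayer(d_model, 1)                # module 3

    def forward(self, x):
        # x: (batch, seq_len, num_features), one health-feature vector per cycle
        h = self.local(x)              # local nonlinear feature extraction
        h = self.global_encoder(h)     # cycle-to-cycle temporal dependencies
        return self.head(h[:, -1, :])  # SOH estimate for the latest cycle
```

For example, KanFormer()(torch.randn(8, 50, 4)) maps a batch of 50-cycle sequences of the four selected health features to one SOH estimate per sequence.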