In the present paper, we investigate the learning rate of l2-coefficient regularized classification with a strongly convex loss and data-dependent kernel function spaces. The results show that the learning rate is influenced by the strong convexity of the loss.
We analyze the learning rates of least squares regression with data-dependent hypothesis spaces and coefficient regularization algorithms based on general kernels. Under a very mild regularity condition on the regression function, we obtain a bound for the approximation error by estimating the corresponding K-functional. Combining this estimate with the previous result on the sample error, we derive a dimension-free learning rate through a proper choice of the regularization parameter.
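As a minimal illustration (not part of the paper's analysis), the l2-coefficient regularization scheme for least squares can be sketched as follows: the hypothesis f(x) = Σⱼ aⱼ k(x, xⱼ) ranges over the data-dependent space spanned by the sample points, and the penalty acts on the coefficient vector rather than an RKHS norm, so the kernel need not be positive definite. The Gaussian kernel and all function names here are assumptions chosen for the example.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    # A general kernel; Gaussian chosen purely for illustration.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def coeff_regularized_ls(X, y, lam, kernel=gaussian_kernel):
    """Minimize (1/m) * ||K a - y||^2 + lam * ||a||^2 over the
    coefficient vector a, where f(x) = sum_j a_j k(x, x_j)."""
    m = len(y)
    K = kernel(X, X)
    # Normal equations: (K^T K / m + lam * I) a = K^T y / m
    a = np.linalg.solve(K.T @ K / m + lam * np.eye(m), K.T @ y / m)
    return lambda Xnew: kernel(Xnew, X) @ a

# Usage: fit noisy samples of a smooth regression function.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(40, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.normal(size=40)
f = coeff_regularized_ls(X, y, lam=1e-3)
pred = f(X)
```

The regularization parameter `lam` plays the role of the parameter whose proper choice yields the dimension-free learning rate discussed above.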