Quantum estimation plays an important role in quantum information science and is used in particular to confirm the successful experimental implementation of quantum protocols. The main goals of the field are to clarify the behavior of estimation errors for finite data and to find ways to reduce them, but these goals have not been achieved completely, even for the relatively simple problem of one-qubit state estimation. In this talk, we analyze the behavior of estimation errors, evaluated by two loss functions, the Hilbert-Schmidt distance and the infidelity, in one-qubit state tomography with finite data, and we improve these errors by using an adaptive design of experiments. First, we derive an explicit function reproducing the behavior of the estimation errors for finite data by introducing two approximations: a Gaussian approximation of the multinomial distribution of outcomes and a linearization of the boundary of the state space. Second, in order to reduce the estimation errors, we consider an estimation scheme that adaptively updates the measurements according to the previously obtained outcomes and measurement settings. The updates are determined by the average-variance-optimality (A-optimality) criterion, known from the classical theory of experimental design and applied here to quantum state estimation. We numerically compare two adaptive and two nonadaptive schemes on finite data sets and show that the A-optimality criterion yields more precise estimates than standard quantum tomography.
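The A-optimality update step can be sketched as follows for one-qubit tomography with projective Pauli measurements: pick the next measurement axis that minimizes the trace of the inverse of the accumulated Fisher information, i.e. the predicted average variance of the Bloch-vector estimate. This is a minimal illustration, not the scheme of the talk; the function names, the restriction to three fixed candidate axes, and the specific numbers in the demo are assumptions made for the example.

```python
import numpy as np

def fisher_single(axis, r_est):
    """One-shot Fisher information matrix for a +/-1 projective measurement
    along the Bloch direction `axis`, at estimated Bloch vector `r_est`.
    Outcome +1 has probability p = (1 + axis.r)/2, which gives
    I(axis) = axis axis^T / (1 - (axis.r)^2)."""
    b = float(axis @ r_est)
    # Guard against division by zero for estimates on the Bloch-sphere boundary.
    return np.outer(axis, axis) / max(1.0 - b**2, 1e-12)

def a_optimal_axis(candidates, F_total, r_est):
    """A-optimality criterion: choose the candidate axis minimizing
    tr[(F_total + I(axis))^{-1}], the predicted average variance of the
    Bloch-vector estimate after one more shot."""
    costs = [np.trace(np.linalg.inv(F_total + fisher_single(a, r_est)))
             for a in candidates]
    return candidates[int(np.argmin(costs))]

# Illustrative demo with sigma_x, sigma_y, sigma_z measurement directions.
axes = [np.eye(3)[k] for k in range(3)]
# Hypothetical accumulated information from earlier shots, and a current
# estimate close to the z-axis (i.e. near the boundary of the state space).
F_total = np.diag([10.0, 10.0, 52.6])
r_est = np.array([0.0, 0.0, 0.9])
next_axis = a_optimal_axis(axes, F_total, r_est)
# Near the boundary, z-outcomes are almost deterministic and carry little new
# information about r_x and r_y, so an equatorial axis is selected.
print(next_axis)
```

In this example the criterion switches away from the nearly deterministic z-measurement toward an equatorial axis, which is the qualitative behavior that makes the adaptive scheme outperform a fixed measurement schedule.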