This paper presents a parsimonious method of factor analysis that requires the final determination of many fewer parameters than unrestricted maximum likelihood factor analysis, while achieving almost as close a fit to the sample correlation matrix. Because unrestricted analysis estimates as many parameters as possible (np - p(p-1)/2, where p is the number of factors and n is the number of variables), it violates Occam's razor: explanatory entities (parameters) must not be multiplied needlessly. Failing to control the number of parameters is not a mathematical error, since the excess parameters do exist in the unrestricted parameter space. It is, however, a statistical error, because there is no statistical justification for so many parameters. Indeed, although this large number of parameters does provide a closer fit to the sample correlation matrix, the improvement in fit can never be judged statistically significant relative to parsimonious factor analysis. Parsimonious factor analysis (PFA) is an empirically restricted method that reduces the number of parameters to be determined to p(p+1)/2 by using known constants in modeling the principal factor matrix. The constants are the normed eigenvectors of the sample correlation matrix, which are obtained tautologically from its eigenvalue-eigenvector decomposition; they do not use up any degrees of freedom. With this method, a misleading tradition of over 30 years' standing in classical factor analysis can be abandoned, and parsimonious factor analysis can be seen as a superior approach, statistically correct but not numerically very different from classical factor analysis. Moreover, PFA extends factor analysis so that sample correlation matrices whose rank is less than their order can now be analyzed, which was never previously possible.
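The two ideas in the abstract that lend themselves to a small numerical illustration are (i) that the normed eigenvectors come "for free" from the eigenvalue-eigenvector decomposition of the sample correlation matrix, and (ii) the contrast between the two parameter counts. The sketch below, assuming an illustrative 4-variable correlation matrix and a 2-factor model (both hypothetical choices, not from the paper), verifies the tautological decomposition and evaluates the counts np - p(p-1)/2 and p(p+1)/2 given in the text:

```python
import numpy as np

# Toy sample correlation matrix for n = 4 variables (illustrative values only).
R = np.array([
    [1.0, 0.6, 0.5, 0.4],
    [0.6, 1.0, 0.5, 0.3],
    [0.5, 0.5, 1.0, 0.4],
    [0.4, 0.3, 0.4, 1.0],
])

# Eigenvalue-eigenvector decomposition of R.  The normed eigenvectors V satisfy
# R = V diag(lam) V' identically, so they are known constants that consume
# no degrees of freedom.
lam, V = np.linalg.eigh(R)
assert np.allclose(V @ np.diag(lam) @ V.T, R)

n = R.shape[0]   # number of variables
p = 2            # number of factors (hypothetical choice for illustration)

# Parameter counts quoted in the abstract:
unrestricted = n * p - p * (p - 1) // 2   # unrestricted ML factor analysis
pfa = p * (p + 1) // 2                    # parsimonious factor analysis (PFA)
print(unrestricted, pfa)  # 7 vs 3 for n = 4, p = 2
```

Even in this tiny example the parsimonious count is less than half the unrestricted one, and the gap widens as n grows, since np - p(p-1)/2 scales with n while p(p+1)/2 does not.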