A Small Footprint i-Vector Extractor
Patrick Kenny
Both the memory and computational requirements of algorithms traditionally used to extract i-vectors at run time and to train i-vector extractors off-line scale quadratically in the i-vector dimensionality. We describe a variational Bayes algorithm for calculating i-vectors exactly, which converges in a few iterations and whose computational and memory requirements scale linearly rather than quadratically. For typical i-vector dimensionalities, the computational requirements are greater than those of the traditional algorithm but still quite modest when compared with the cost of extracting Baum-Welch statistics. The run time memory requirement is scarcely greater than that needed to store the eigenvoice basis. The variational Bayes algorithm enables the construction of i-vector extractors of higher dimensionality than has previously been envisaged. We show that modest gains in speaker verification accuracy (as measured by the 2010 NIST detection cost function) can be achieved using high dimensional i-vectors.
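The abstract does not give the update equations, but the linear-scaling claim can be illustrated with a coordinate-wise (mean-field) variational Bayes sketch of i-vector extraction. The numpy code below is an illustration under stated assumptions, not a reproduction of the paper's algorithm: the eigenvoice matrix `T` and the centred first-order statistics `F` are assumed to be pre-whitened by the inverse square root of the UBM covariances, the zero-order statistics `N` are assumed to be expanded to one entry per feature dimension, and the function name `ivector_vb` and the sweep count are hypothetical.

```python
import numpy as np


def ivector_vb(T, N, F, n_sweeps=3):
    """Coordinate-wise variational Bayes i-vector sketch (assumed setup).

    T : (CF, D) eigenvoice matrix, rows pre-whitened by Sigma^{-1/2}
    N : (CF,)   zero-order Baum-Welch stats, expanded per feature dimension
    F : (CF,)   centred first-order stats, also pre-whitened

    With this layout the posterior precision is L = I + T' diag(N) T, but the
    D x D matrix L is never formed: only its diagonal and matrix-vector
    products with T are needed, so memory grows linearly in D.
    """
    CF, D = T.shape
    b = T.T @ F                                       # linear term of the posterior
    diagL = 1.0 + (N[:, None] * T * T).sum(axis=0)    # diagonal of the precision L
    w = np.zeros(D)                                   # variational means (prior mean start)
    Tw = np.zeros(CF)                                 # running product T @ w
    for _ in range(n_sweeps):
        for i in range(D):
            # i-th entry of L @ w, computed from the running product T @ w
            Lw_i = w[i] + T[:, i] @ (N * Tw)
            # mean-field update for coordinate i given all other coordinates
            new_wi = w[i] + (b[i] - Lw_i) / diagL[i]
            Tw += (new_wi - w[i]) * T[:, i]           # keep T @ w consistent
            w[i] = new_wi
    return w


# Toy example with random statistics (shapes chosen only for illustration).
rng = np.random.default_rng(0)
CF, D = 60, 8
T = rng.standard_normal((CF, D)) / np.sqrt(D)
N = rng.uniform(0.0, 5.0, size=CF)
F = rng.standard_normal(CF)
w_vb = ivector_vb(T, N, F)
# Reference: the exact posterior mean from the quadratic-memory formulation.
L = np.eye(D) + T.T @ (N[:, None] * T)
w_exact = np.linalg.solve(L, T.T @ F)
print(np.max(np.abs(w_vb - w_exact)))  # small after a few sweeps
```

In this sketch, maintaining `T @ w` incrementally makes a full sweep over all D coordinates cost a single pass over `T`, and the only per-utterance storage beyond `T` itself is a handful of length-CF and length-D vectors, which mirrors the linear memory and computational scaling claimed in the abstract.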