Instead of quantizing the input samples one-by-one, we take $n$ input samples: $X = [x_1, x_2, \ldots, x_n] \in \mathbb{R}^n$
in other words: we quantize vectors, not scalars
the rate of the source code is $R = \frac{\log_2 K}{n}$ bits / source sample ($\log_2 K$ bits index one of $K$ codewords, shared over $n$ samples)
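As a quick numeric check of the rate formula (the `vq_rate` helper below is hypothetical, introduced only for illustration):

```python
import math

# Rate of a vector quantizer: log2(K) bits index one of K codewords,
# and that cost is spread over the n samples in each input vector.
def vq_rate(K, n):
    return math.log2(K) / n  # bits per source sample

# e.g. a codebook of K = 256 codewords on vectors of n = 4 samples
# costs log2(256)/4 = 2 bits per source sample
print(vq_rate(256, 4))  # → 2.0
```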
$\mathbb{R}^n$ is partitioned into $K$ regions, $\{\mathcal{R}_i\}_{i=1}^K$, and $Q(X) = \hat{X}_i, \ \forall X \in \mathcal{R}_i$
$\{\mathcal{R}_i\}_{i=1}^K$ is the set of Voronoi regions
The Voronoi region $\mathcal{R}_i$ is the part of (Euclidean) space wherein a vector $X$ is closer to codeword / centroid $\hat{X}_i$ than to any other $\hat{X}_j$, $j \neq i$: $\mathcal{R}_i = \{X \in \mathbb{R}^n \mid \|X - \hat{X}_i\| < \|X - \hat{X}_j\| \text{ for all } j \neq i\}$
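The nearest-neighbor rule that defines the Voronoi regions can be sketched as follows (a minimal NumPy sketch; the codebook values are illustrative, not from the source):

```python
import numpy as np

# Illustrative codebook: K = 3 codewords in R^n with n = 2
codebook = np.array([[0.0, 0.0],
                     [1.0, 1.0],
                     [-1.0, 1.0]])

def quantize(X, codebook):
    # Distance from X to every codeword; X lies in Voronoi region R_i
    # of the codeword minimizing the Euclidean distance.
    dists = np.linalg.norm(codebook - X, axis=1)
    i = int(np.argmin(dists))
    return i, codebook[i]

i, Xhat = quantize(np.array([0.9, 1.2]), codebook)
print(i, Xhat)  # → 1 [1. 1.]
```

The quantizer only transmits the index $i$; the decoder looks up $\hat{X}_i$ in the same codebook.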
Optimal VQ: $\hat{X}_i$ is the centroid of the Voronoi region $\mathcal{R}_i$:
$\hat{X}_i := E[X \mid X \in \mathcal{R}_i]$
A vector quantizer maps $n$-dimensional vectors $X \in \mathbb{R}^n$ to a finite set of $K$ vectors
Each such vector is called a code vector or codeword, and the set of all codewords, $\{\hat{X}_i\}_{i=1}^K$, is called the codebook
Associated with each codeword $\hat{X}_i$ is its nearest-neighbor (Voronoi) region $\mathcal{R}_i$
Can obtain codewords via e.g. K-means clustering (Lloyd’s algorithm)
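A minimal Lloyd's-algorithm (K-means) sketch for codebook design, alternating the two optimality conditions above; the training data and parameters are illustrative, not from the source:

```python
import numpy as np

def lloyd(train, K, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    # initialize the codebook with K distinct random training vectors
    codebook = train[rng.choice(len(train), K, replace=False)]
    for _ in range(iters):
        # nearest-neighbor condition: partition training set into Voronoi regions
        d = np.linalg.norm(train[:, None, :] - codebook[None, :, :], axis=2)
        assign = d.argmin(axis=1)
        # centroid condition: replace each codeword by the sample mean
        # of its region, the empirical E[X | X in R_i]
        for i in range(K):
            members = train[assign == i]
            if len(members):
                codebook[i] = members.mean(axis=0)
    return codebook

# toy training set: two Gaussian clusters in R^2
rng = np.random.default_rng(1)
train = np.vstack([rng.normal(-2.0, 0.3, (100, 2)),
                   rng.normal(+2.0, 0.3, (100, 2))])
print(lloyd(train, K=2))  # codewords near the two cluster means
```

Each iteration can only decrease the average distortion, so the algorithm converges to a locally (not necessarily globally) optimal codebook.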