In multiclass classification, one faces greater uncertainty when the data fall near the decision boundary. To reduce this uncertainty, one can wait and collect more data, but doing so invariably delays the decision. How can one make an accurate classification as quickly as possible? The solution requires a multiclass generalization of Wald's sequential hypothesis testing, but the standard formulation is intractable because of the curse of dimensionality in dynamic programming. In "Optimal Sequential Multiclass Diagnosis," Wang shows that, in a broad class of practical problems, the reachable state space is often restricted to, or near, a set of low-dimensional, time-dependent manifolds. Having identified the key drivers of this sparsity, the author develops a new solution framework that uses a low-dimensional statistic to reconstruct the high-dimensional state. This framework circumvents the curse of dimensionality, allowing efficient computation of optimal or near-optimal policies for quickest classification with large numbers of classes.
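For context, the classical two-hypothesis procedure that the paper generalizes is Wald's sequential probability ratio test (SPRT): accumulate the log-likelihood ratio one observation at a time and stop as soon as it crosses one of two thresholds set by the target error rates. The sketch below is a minimal illustration of that binary case, not the paper's multiclass method; the Gaussian hypotheses and error rates are assumptions chosen for the example.

```python
import math
from typing import Callable, Iterable, Optional, Tuple

def sprt(observations: Iterable[float],
         llr_increment: Callable[[float], float],
         alpha: float = 0.1,
         beta: float = 0.1) -> Tuple[Optional[int], int]:
    """Wald's binary SPRT.

    Returns (decision, samples_used): decision is 1 (accept H1),
    0 (accept H0), or None if the data ran out before a boundary
    was crossed. Thresholds use Wald's classical approximations.
    """
    upper = math.log((1 - beta) / alpha)   # cross above -> accept H1
    lower = math.log(beta / (1 - alpha))   # cross below -> accept H0
    llr, n = 0.0, 0
    for x in observations:
        n += 1
        llr += llr_increment(x)            # update evidence sequentially
        if llr >= upper:
            return 1, n
        if llr <= lower:
            return 0, n
    return None, n

# Example hypotheses: H0 = N(0, 1) vs H1 = N(1, 1).
# The per-sample log-likelihood ratio simplifies to x - 0.5.
inc = lambda x: x - 0.5
decision, n = sprt([1.2, 0.8, 1.5, 0.9, 1.1], inc)
print(decision, n)  # -> 1 4  (H1 accepted after 4 samples)
```

With more than two classes, the analogous state is the full posterior over all classes, and the dynamic program over that simplex is what becomes intractable as the number of classes grows; the paper's contribution is exploiting the low-dimensional structure of the reachable posteriors.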