Spare & Sparse

Sparse: a borrowing from Latin. Etymons: Latin sparsus, spargĕre. Etymology: Latin sparsus, past participle of spargĕre, to scatter. Compare Italian sparso.

The OED hasn’t yet caught up with sparsity as it is used in the tech community. An interesting phrase in itself, given that the ‘tech community’ is everywhere, all the time, not just hanging out in West Coast coffee shops and startup spaces, talking to VCs.

Sparsity

In AI inference and machine learning, sparsity refers to a matrix of numbers that includes many zeros or values that will not significantly impact a calculation. Researchers in machine learning are trying to accelerate AI using sparsity, pulling as many unneeded parameters as possible out of a neural network — without unraveling AI’s accuracy. The goal is to reduce the mounds of matrix multiplication deep learning requires, shortening the time to good results. So far, there have been no big winners. Researchers have tried a variety of techniques to pull out as many as 95 percent of the weights in a neural network. But then, spending more time than they saved, they are forced to take heroic steps to claw back the accuracy of the streamlined model. And the steps that have worked for one model haven’t worked on others.

This is an adapted quote from the Nvidia website. By this definition, a spare website such as this one has undergone successful sparsification.
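To make the quote concrete, here is a minimal sketch of the simplest form of the pruning it describes: magnitude pruning, which zeroes the smallest-magnitude weights in a layer’s matrix. This is a hypothetical NumPy illustration, not Nvidia’s method; the function name, the 95 percent target, and the 512×512 matrix are stand-ins invented for the example.

```python
import numpy as np

# A stand-in for one layer's weight matrix in a trained network.
rng = np.random.default_rng(0)
weights = rng.normal(size=(512, 512))

def magnitude_prune(w: np.ndarray, sparsity: float = 0.95) -> np.ndarray:
    """Return a copy of w with its smallest-magnitude entries set to zero.

    `sparsity` is the target fraction of zeroed weights; 0.95 mirrors
    the 95 percent figure in the quote above.
    """
    k = int(sparsity * w.size)  # number of weights to remove
    if k == 0:
        return w.copy()
    flat = np.abs(w).ravel()
    # The k-th smallest magnitude; everything at or below it gets zeroed.
    threshold = np.partition(flat, k - 1)[k - 1]
    return np.where(np.abs(w) <= threshold, 0.0, w)

pruned = magnitude_prune(weights)
print(f"fraction zeroed: {np.mean(pruned == 0.0):.3f}")  # about 0.950
```

Getting 95 percent of the entries to zero is the easy part; as the quote notes, clawing back the lost accuracy afterwards is where the heroic steps come in.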