Storage shares rebounded on Friday after sharp declines in the prior trading session. SanDisk rose more than 3%; Micron and Western Digital gained nearly 2%; Seagate Technology added about 1%.
Alphabet revealed a set of new algorithms on Wednesday designed to reduce the amount of memory needed to run large language models and vector search engines.
The algorithms introduced by Google include TurboQuant, Quantized Johnson-Lindenstrauss, and PolarQuant. TurboQuant is a compression algorithm that optimally addresses the memory overhead of vector quantization.
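To make the memory-overhead problem concrete, here is a minimal sketch of plain scalar quantization, the baseline technique such algorithms improve on. This is not TurboQuant itself; the function names and the int8 format are our own illustrative assumptions.

```python
import numpy as np

def quantize_int8(v):
    """Map a float32 vector to int8 codes plus a single float scale.

    Illustrative baseline only (not TurboQuant): each 4-byte float
    becomes a 1-byte code, a 4x reduction in stored memory.
    """
    scale = np.max(np.abs(v)) / 127.0
    codes = np.clip(np.round(v / scale), -127, 127).astype(np.int8)
    return codes, scale

def dequantize(codes, scale):
    # Approximate reconstruction of the original vector.
    return codes.astype(np.float32) * scale

rng = np.random.default_rng(0)
v = rng.standard_normal(1024).astype(np.float32)
codes, scale = quantize_int8(v)

print(v.nbytes, codes.nbytes)  # 4096 1024
```

The trade-off such research targets is exactly this: shrinking the stored bytes while keeping the reconstruction error, and hence search or model quality, as close to the original as possible.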
Quantized Johnson-Lindenstrauss builds on a mathematical technique known as the Johnson-Lindenstrauss Transform, which shrinks complex, high-dimensional data while preserving the essential distances and relationships between data points. Google's quantized version creates a high-speed shorthand that requires zero memory overhead.
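The distance-preserving idea behind the transform can be shown with the classic random-projection version of Johnson-Lindenstrauss (a textbook sketch, not Google's quantized variant; the dimensions chosen here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(42)
d, k, n = 4096, 256, 8          # original dim, reduced dim, number of points
X = rng.standard_normal((n, d))

# Gaussian JL map: entries scaled by 1/sqrt(k) so squared norms are
# preserved in expectation after projection.
P = rng.standard_normal((d, k)) / np.sqrt(k)
Y = X @ P                        # 4096-dim points compressed to 256 dims

orig = np.linalg.norm(X[0] - X[1])
proj = np.linalg.norm(Y[0] - Y[1])
print(f"relative distance error: {abs(proj / orig - 1.0):.3f}")
```

The projected points occupy a 16x smaller space, yet the distance between any pair is preserved up to a small relative error, which is what keeps nearest-neighbor search meaningful after compression.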
Finally, PolarQuant addresses the memory overhead problem by using polar coordinates. This allows LLMs to skip the data normalization step because it maps data onto a fixed, predictable “circular” grid where the boundaries are already known.
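As a toy illustration of the polar-coordinate idea (our own simplification, not Google's PolarQuant), a 2-D point can be stored as a radius plus an angle snapped to one of 256 fixed positions on a circle, so the code boundaries are known in advance:

```python
import math

LEVELS = 256
STEP = 2 * math.pi / LEVELS      # fixed "circular" grid spacing

def encode(x, y):
    """Store a 2-D point as (radius, 1-byte angle code)."""
    r = math.hypot(x, y)
    code = round(math.atan2(y, x) / STEP) % LEVELS
    return r, code

def decode(r, code):
    # The grid is fixed, so decoding needs no per-vector normalization data.
    theta = code * STEP
    return r * math.cos(theta), r * math.sin(theta)

x, y = 0.6, -0.8                 # a unit-length pair of coordinates
r, code = encode(x, y)
xd, yd = decode(r, code)
print(abs(xd - x) < 0.02 and abs(yd - y) < 0.02)  # True
```

Because every angle snaps to the same predetermined grid, the quantization boundaries never depend on the data, which is the property the article describes as letting models skip the normalization step.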
"As AI becomes more integrated into all products, from LLMs to semantic search, this work in fundamental vector quantization will be more critical than ever," said Google research scientist Amir Zandieh and Vahab Mirrokni, VP and Google Fellow, in a blog post.
Google plans to present TurboQuant at the International Conference on Learning Representations in Rio de Janeiro in April.