
"Despite ongoing speculation around an investment bubble that may be set to burst, artificial intelligence (AI) technology is here to stay. And while an over-inflated market may exist at the level of the suppliers, AI is well-developed and has a firm foothold among organisations of all sizes. But AI workloads place specific demands on IT infrastructure and on storage in particular. Data volumes can start big and then balloon, in particular during training phases as data is vectorised and checkpoints are created."
"During training, processing focuses on what is effectively pattern recognition. Large volumes of data are examined by an algorithm - likely part of a deep learning framework like TensorFlow or PyTorch - that aims to recognise features within the data. This could be visual elements in an image or particular words or patterns of words within documents. These features, which might fall under the broad categories of "a cat" or "litigation", for example, are given values and stored in a vector database."
AI is widely adopted across organisations despite market speculation. AI workloads split into training and inference stages. Training performs pattern recognition across large datasets using deep learning frameworks such as TensorFlow or PyTorch. Features are extracted, vectorised and stored in vector databases; complex concepts are represented as combinations of discrete values (for example, a tortoiseshell cat comprises values for cat and tortoiseshell). Data volumes can start large and then balloon during training as vectors and checkpoints are produced. Inference applies trained models to production data. These workflows impose high storage I/O, throughput and capacity demands and require careful data curation and lifecycle management.
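The summary above describes features being given values, stored in a vector database, and composite concepts (a tortoiseshell cat) represented as combinations of discrete values. A minimal sketch of that idea follows; the tiny 4-dimensional embeddings and the `combine` helper are invented for illustration, whereas real systems use model-produced embeddings with hundreds or thousands of dimensions and an approximate nearest-neighbour index.

```python
import math

# Hypothetical feature embeddings (one toy vector per concept).
# Real embeddings come from a trained model, not hand-written values.
embeddings = {
    "cat":           [0.9, 0.1, 0.0, 0.2],
    "tortoiseshell": [0.1, 0.8, 0.3, 0.0],
    "litigation":    [0.0, 0.1, 0.1, 0.9],
}

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(v):
    return math.sqrt(dot(v, v))

def cosine(a, b):
    # Cosine similarity, a common distance metric in vector databases.
    return dot(a, b) / (norm(a) * norm(b))

def combine(*names):
    # A composite concept ("tortoiseshell cat") as the element-wise
    # sum of its component vectors -- one simple way to combine values.
    return [sum(vals) for vals in zip(*(embeddings[n] for n in names))]

query = combine("tortoiseshell", "cat")

# Rank stored concepts by similarity to the composite query,
# as a vector database's nearest-neighbour search would.
ranked = sorted(embeddings, key=lambda n: cosine(query, embeddings[n]),
                reverse=True)
print(ranked)  # most similar concept first
```

Running this ranks "cat" closest to the composite query and "litigation" furthest, which is the behaviour a similarity search over vectorised features relies on.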
 Read at ComputerWeekly.com