Over a year ago, Microsoft announced that it was working with hardware vendors to offer GPU-accelerated training of machine learning (ML) models on the Windows Subsystem for Linux (WSL). A preview of this ...
If you're looking to use Ollama to run local LLMs (large language models) on a Windows PC, you have two options. The first is to use the Windows app and run it natively; the second is to run Ollama inside WSL.
When Microsoft slipped the first public preview of the Windows Subsystem for Linux (WSL) into the Windows 10 Anniversary Update in August 2016, it mostly appeared to be a niche convenience aimed at ...