Modern Macs, thanks in part to the unified memory architecture of Apple's System-on-a-Chip platform, are a favorite among developers who want to run large language models (LLMs) locally. That's great during development, and simply fun to try out, but very few companies then deploy their models on Apple Silicon. For a while now, webAI has focused on bringing machine learning and small generative AI models to Apple devices, both phones and desktops.
Now the company is taking this a step further thanks to a…