2 Comments
Marcos Alano
3 hours ago

Thanks. I just found out onnxruntime.ai can run on NPUs. I’m excited for the day when Ollama and llama.cpp will be able to use the NPU.