Dr. Wolfram is quite busy, especially with AI work nowadays. We check in with his team (Wolfram Blockchain) once in a while to explore potential collaboration, but there are no immediate activities right now.
How can you make a portal for app developers to host their AI inference on your network? There are open-source models that can be hosted, like Llama or Grok or others.
LLM inference (especially with smaller models) can run on modest hardware, e.g. a $350 mini-PC with an AMD APU, so distributed inference on NKN nodes is feasible if the nodes have the processing capability. The main issue is that most NKN nodes today are low-cost virtual machines with very limited CPU and RAM, and no GPU at all. That is perfectly fine for pure networking like NKN Node, but not enough for even 1B/3B LLM inference. Our CTO has already written an article on how to do it over the NKN network. If there is enough interest, we can investigate further.
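To make this concrete, here is a minimal sketch (an illustration, not NKN's actual design) of how a GPU-equipped node could serve as an inference worker reachable by NKN address instead of IP. It uses the public nkn-sdk JS client and assumes a local llama.cpp "llama-server" instance is already running on port 8080 with a small model loaded; the `llm-worker` identifier and the request parameters are assumptions for the example.

```typescript
// Hypothetical inference worker: receives prompts over NKN,
// forwards them to a local llama.cpp server, and replies with
// the completion. Requires Node 18+ (global fetch) and nkn-sdk.
import nkn from 'nkn-sdk';

async function main() {
  // This worker's identity on the NKN network; clients dial
  // "<identifier>.<public key>" rather than an IP address.
  const client = new nkn.MultiClient({ identifier: 'llm-worker' });
  await new Promise((resolve) => client.onConnect(resolve));
  console.log('inference worker listening at', client.addr);

  client.onMessage(async ({ src, payload }) => {
    const prompt = payload.toString();
    console.log('prompt from', src);
    // Assumed local llama.cpp "llama-server" endpoint; adjust the
    // port and sampling parameters to your own setup.
    const res = await fetch('http://127.0.0.1:8080/completion', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ prompt, n_predict: 128 }),
    });
    const { content } = await res.json();
    // Returning a value from the onMessage handler sends it back
    // to the sender as the reply.
    return content;
  });
}

main().catch(console.error);
```

On the client side, an app developer would send a prompt to the worker's NKN address, e.g. `const reply = await client.send('llm-worker.' + workerPubKey, 'Hello');`, with the SDK routing the message through NKN relay nodes end to end.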