Alibaba Cloud: AI Models, Reducing Footprint of Nvidia GPUs, and Cloud Streaming (boilingsteam.com)
13 points by ekianjo 4 months ago
Havoc 4 months ago
> they already have developed their own CUDA-like support, that is not 100% compatible, but works well enough for their internal uses at Alibaba.
I wonder whether that's inference or training.

  ekianjo 4 months ago
  From what I could gather, this was mostly for inference.
daft_pink 4 months ago
Not sure about everyone else, but I can't justify sending my data workflows to China.

  rightbyte 4 months ago
  Do you imply you can justify sending your data anywhere else?

    daft_pink 4 months ago
    Unfortunately, it's not affordable to run your own LLM yet, but since I work in a regulated industry, it can't go to China. But you are correct: the moment it becomes practical to do it in-house, it will be done that way.