Can you trust locally run LLMs?
wuphysics87 to Privacy@lemmy.ml • 5 months ago
I’ve been playing around with ollama. Given that you download the model and run it yourself, can you trust that it isn’t sending telemetry?
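One thing I figured I could try is watching the server process for outbound connections while a model is generating. A rough sketch of that idea (nothing official from the ollama project; it assumes Python with psutil installed and a server process named "ollama", and a packet capture or outbound firewall rule would be more thorough):

```python
# Sketch: list any non-loopback connections held by the local ollama process.
# Assumptions: psutil is installed, the server process is named "ollama",
# and you run this while a model is actively generating.
import psutil


def remote_connections(process_name: str = "ollama"):
    """Yield (pid, remote_ip, remote_port, status) for non-local connections."""
    for proc in psutil.process_iter(["pid", "name"]):
        if proc.info["name"] != process_name:
            continue
        try:
            conns = proc.connections(kind="inet")
        except (psutil.AccessDenied, psutil.NoSuchProcess):
            continue
        for conn in conns:
            # Skip loopback traffic (the local API) and sockets with no peer.
            if conn.raddr and conn.raddr.ip not in ("127.0.0.1", "::1"):
                yield proc.info["pid"], conn.raddr.ip, conn.raddr.port, conn.status


if __name__ == "__main__":
    hits = list(remote_connections())
    if hits:
        for pid, ip, port, status in hits:
            print(f"pid {pid} -> {ip}:{port} ({status})")
    else:
        print("No non-local connections observed.")
```

Note that pulling a model will legitimately show connections to the registry, so the interesting question is what happens during inference only.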
@stink@lemmygrad.ml • 5 months ago: It’s nice, but sadly it’s hard to load unsupported models. I really wish you could easily sideload, but it’s fine unless you have a niche use case.
deleted by creator