What’s the best affordable pre-built mini server available in Europe? I’m looking for a reliable and compact option that won’t break the bank

Edit: something that is not ARM-based

Edit 2: I’m looking to set up a system that can run Jellyfin, Ollama, and a few other small services. By ‘pre-built’, I mean I want to buy a device that already has the necessary hardware installed, so all I need to do is install the operating system and I’m good to go

  • @delver@sh.itjust.works

    I’m not sure if they’re still affordable, but I ended up getting both a Morefine and a Beelink, one with the Intel N100 CPU and the other with the N305. They handle everything I’ve thrown at them and come with out-of-the-box QuickSync transcoding for Jellyfin/Plex; they handle 4K transcodes like a champ. Couple that with 2.5G Ethernet and they sip power. They might have gone up in price since I bought mine, though.
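
    If you go the iGPU route, it can be worth confirming that the render node Jellyfin needs for QuickSync/VA-API is actually exposed before digging into transcoding settings. A minimal sketch, assuming a typical Linux setup where the device shows up under /dev/dri and belongs to the "render" group (both are assumptions about the host, not something specific to these boxes):

    ```python
    #!/usr/bin/env python3
    """Sanity check: is an Intel iGPU render node exposed for VA-API/QuickSync?

    Assumes a typical Linux layout where render nodes live under /dev/dri
    (e.g. /dev/dri/renderD128) and are owned by the "render" group.
    """
    import grp
    import os
    from pathlib import Path


    def main() -> None:
        dri = Path("/dev/dri")
        nodes = sorted(dri.glob("renderD*")) if dri.exists() else []
        if not nodes:
            print("No render nodes found - iGPU disabled or drivers missing?")
            return
        for node in nodes:
            group = grp.getgrgid(node.stat().st_gid).gr_name
            usable = os.access(node, os.R_OK | os.W_OK)
            print(f"{node}: group={group}, accessible by current user={usable}")


    if __name__ == "__main__":
        main()
    ```

    If a node shows up but isn’t accessible, adding the user (or container) that runs Jellyfin to that group is usually the missing step.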

  • poVoq

    You need to first explain what you want the server for, because that will give us an idea of your CPU and storage requirements.

    • @diy@sh.itjust.works (OP)

      I’m looking to set up a system that can run Jellyfin, Ollama, and a few other small services.

      • @Mixel@feddit.de

        Ollama is the big one here: do you want it to be fast? If so, you will need a GPU. How large is the model you will be running? 7/8B on CPU is not as fast, but no problem; 13B is slow on CPU, but possible.
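
        If you want a rough feel for CPU-only speed before committing to hardware, a minimal sketch against Ollama’s local HTTP API can help; the model name, prompt, and default localhost:11434 endpoint below are assumptions, so swap in whatever you actually pull:

        ```python
        #!/usr/bin/env python3
        """Rough CPU-only benchmark: time a single Ollama generation.

        Assumes Ollama is running locally on its default port (11434) and that a
        small model (e.g. "llama3:8b") has already been pulled with `ollama pull`.
        """
        import json
        import time
        import urllib.request

        OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint
        MODEL = "llama3:8b"  # hypothetical choice - use whichever model you pulled


        def main() -> None:
            payload = json.dumps({
                "model": MODEL,
                "prompt": "Explain in two sentences what hardware transcoding is.",
                "stream": False,  # wait for the full response so timing stays simple
            }).encode()
            req = urllib.request.Request(
                OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
            )
            start = time.monotonic()
            with urllib.request.urlopen(req) as resp:
                body = json.loads(resp.read())
            elapsed = time.monotonic() - start
            tokens = body.get("eval_count", 0)
            rate = tokens / elapsed if elapsed > 0 else 0.0
            print(f"{tokens} tokens in {elapsed:.1f}s (~{rate:.1f} tok/s)")


        if __name__ == "__main__":
            main()
        ```

        That gives you a tokens-per-second number to weigh the 7/8B vs 13B trade-off on whatever CPU you’re considering.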

  • @B0rax@feddit.de

    I like my HPE MicroServer Gen10 Plus.

    It does not come with a GPU by default, but you can install a low-power one.

  • @Getting6409@lemm.ee

    I’ve had a good experience so far with two mini PCs: a Mele Quieter 4C for Kodi and a Morefine M9 (I think this one is branded as mipowcat in the EU). They’re both N100; the M9 can go up to 32 GB of RAM, although it is picky about which modules it will accept. I use the M9 for Jellyfin and about 10 other services. Quick Sync works great as far as I’ve tested it. For Jellyfin I’m relying mostly on direct streaming, but I tried a few episodes while forcing some transcoding by using Firefox for playback, and it worked fine.

  • @Decronym@lemmy.decronym.xyz (bot)

    Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I’ve seen in this thread:

    Fewer Letters  More Letters
    NVMe           Non-Volatile Memory Express interface for mass storage
    Plex           Brand of media server package
    SSD            Solid State Drive mass storage

    3 acronyms in this thread; the most compressed thread commented on today has 16 acronyms.

    [Thread #785 for this sub, first seen 5th Jun 2024, 13:55]

  • @bitchkat@lemmy.world

    I love my NUCs but haven’t really paid attention to what has happened since Intel sold that line to ASUS.

  • @Blue_Morpho@lemmy.world

    How small? How many drives? I bought several used Lenovo P330s with the Xeon E-2276G for my servers.

    The Intel CPU has a great low-power iGPU for video encoding/decoding when streaming.

    The Xeon’s ECC RAM gives long-term reliability, which is important if you leave your PC on 24/7 for years at a time.

        • @diy@sh.itjust.works (OP)

          I just need something that works. I’ve had a bad experience with a previous model that wouldn’t boot from my Ubuntu Server drive, no matter how much time I spent on it. But if you know of any models that are worth checking out, I’m all ears.

  • @Aux@lemmy.world

    If you want to run Ollama and other ML stuff, you’re looking at buying an RTX 4090, my friend. ‘Affordable’ and ‘ML’ are two things you can’t put in the same sentence.

    • @486@lemmy.world

      While you certainly can run AI models that require such a beefy GPU, there are plenty of models that run fine even on a CPU-only system. So it really depends on what exactly Ollama is going to be used for.

  • @forger125@lemmy.ml

    You can try the Minisforum MS-01. It’s relatively compact and inexpensive, with a lot of options for expandability, plus fairly powerful Intel CPUs: QuickSync for transcoding and enough CPU grunt for smaller LLMs. Here is a nice overview of the device.

  • @HumanPerson@sh.itjust.works

    I see people mentioning small office desktops, and they are good, but I will warn you that they use proprietary parts, so upgrading and repairing them can be difficult. Also, jellyfin.org has some good info under the hardware acceleration section on what to use.
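
    As a companion to those docs, here is a minimal sketch that shells out to `vainfo` (from libva-utils) to see which codec profiles the iGPU actually exposes; the tool and the VA-API profile strings it prints are assumptions about a typical Intel setup rather than anything covered above:

    ```python
    #!/usr/bin/env python3
    """List VA-API codec profiles reported by `vainfo` (libva-utils).

    Assumes an Intel iGPU with the VA-API driver installed. Lines containing
    "EncSlice" indicate hardware encode entrypoints, which is what a Jellyfin
    transcode needs beyond plain decode.
    """
    import shutil
    import subprocess


    def main() -> None:
        if shutil.which("vainfo") is None:
            print("vainfo not found - install libva-utils first.")
            return
        result = subprocess.run(["vainfo"], capture_output=True, text=True)
        output = result.stdout + result.stderr  # vainfo splits info across both
        profiles = [line.strip() for line in output.splitlines() if "VAProfile" in line]
        for line in profiles:
            print(line)
        if not any("EncSlice" in line for line in profiles):
            print("No encode entrypoints reported - transcoding would fall back to CPU.")


    if __name__ == "__main__":
        main()
    ```

    If nothing is listed at all, the VA-API driver package (e.g. intel-media-va-driver on Debian/Ubuntu) is usually what’s missing rather than the hardware itself.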