I am a teacher and I have a LOT of different literature material that I wish to study and play around with.

I wish to have a self-hosted and reasonably smart LLM into which I can feed all the textual material I have generated over the years. I would be interested to see if this model can answer some of the subjective questions I have set on my exams, or write small paragraphs about the topics I teach.

In terms of hardware, I have an old Lenovo laptop with an NVIDIA graphics card.

P.S.: I am not technically very experienced. I run Linux and can do very basic stuff. Never self-hosted anything other than LibreTranslate and a Pi-hole!

  • pushECX@lemmy.world · 1 month ago
    I’d recommend trying LM Studio (https://lmstudio.ai/). You can use it to run language models locally. It has a pretty nice UI and it’s fairly easy to use.

    I will say, though, that it sounds like you want to feed a large amount of text into the model, which will require a model with a large context window and may require a pretty beefy machine.
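
    Once a model is running, you can also talk to it from a script instead of the UI. As a minimal sketch (assuming LM Studio's "Local Server" feature is enabled on its default port 1234, where it exposes an OpenAI-compatible chat endpoint), you could paste a chunk of your course material into the prompt and ask a question about it:

    ```python
    # Sketch: query a model served by LM Studio's local server.
    # Assumptions: the server is running at http://localhost:1234 and
    # exposes an OpenAI-compatible /v1/chat/completions endpoint.
    # Function names (build_request, ask) are illustrative, not from LM Studio.
    import json
    import urllib.request

    def build_request(question: str, material: str) -> dict:
        """Build an OpenAI-style chat payload with course material as context."""
        return {
            "messages": [
                {"role": "system",
                 "content": "You are a literature tutor. Answer using the "
                            "provided course material."},
                {"role": "user",
                 "content": f"Course material:\n{material}\n\nQuestion: {question}"},
            ],
            "temperature": 0.7,
        }

    def ask(question: str, material: str,
            url: str = "http://localhost:1234/v1/chat/completions") -> str:
        """Send the payload to the local server and return the model's reply."""
        payload = json.dumps(build_request(question, material)).encode("utf-8")
        req = urllib.request.Request(
            url, data=payload, headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            body = json.load(resp)
        return body["choices"][0]["message"]["content"]
    ```

    Note that the whole pasted material counts against the model's context window, so for many documents you would need to feed in one excerpt at a time.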