• Just_Pizza_Crust@lemmy.world
    11 months ago

    If you have decent hardware, running ‘Oobabooga’ (text-generation-webui) locally seems to be the best way to achieve decent results. Not only can you remove the limitations by running uncensored models (wizardlm-uncensored), but you can also steer it toward more practical results by writing the first part of the AI’s response yourself.
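
    The prefill trick can be sketched roughly like this. Everything here is an assumption for illustration: the Alpaca-style `### Instruction / ### Response` template (the right template depends on the model you load), and the local endpoint shown in the comments (text-generation-webui exposes an OpenAI-compatible API when that extension is enabled, but your port may differ).

    ```python
    # Sketch of "response prefill" steering against a local text-generation-webui
    # instance. The prompt template and endpoint below are assumptions, not
    # the tool's only supported format.

    def build_prefilled_prompt(instruction: str, response_start: str) -> str:
        """Build a raw completion prompt where the start of the model's
        reply is already written, so the model continues from it instead
        of choosing how to open its answer."""
        return (
            f"### Instruction:\n{instruction}\n\n"
            f"### Response:\n{response_start}"
        )

    prompt = build_prefilled_prompt(
        "Explain how to back up a Linux home directory.",
        "Sure, here is a step-by-step guide:\n1.",
    )

    # You would then send `prompt` to your local instance, e.g. (assumed URL):
    # import requests
    # requests.post("http://127.0.0.1:5000/v1/completions",
    #               json={"prompt": prompt, "max_tokens": 300})
    ```

    Because the reply already begins with “Sure, here is a step-by-step guide:”, the model tends to continue the list rather than refuse or ramble.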