• Veritas@lemmy.ml · 2 years ago

    I’m most excited about the upcoming Vicuna 65B and other LLMs with 100k+ context windows that can basically take a whole book or a large source codebase as input.