
can't use mmap because of ggml? #190


Description

@aicoder2048

llama.cpp: can't use mmap because tensors are not aligned; convert to new format to avoid this
llama_model_load_internal: format = 'ggml' (old version with low tokenizer quality and no mmap support)
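The warning means the model file is in the original unversioned ggml format, which llama.cpp cannot memory-map; the newer on-disk format that the message asks you to convert to aligns tensor data so mmap works. As a quick check, the sketch below (my own, not part of PrivateGPT or llama.cpp) reads the file's 4-byte magic and reports which format family it belongs to. The magic constants are taken from historical llama.cpp sources and are assumed to still match the build that printed this warning.

```python
import struct
import sys

# Historical llama.cpp file magics (assumption: these match the
# llama.cpp version that produced the warning above).
MAGICS = {
    0x67676D6C: "ggml (unversioned, no mmap support)",
    0x67676D66: "ggmf (versioned, still no mmap support)",
    0x67676A74: "ggjt (mmap-capable; the format the warning asks you to convert to)",
}

def inspect(path: str) -> None:
    """Read the 4-byte magic at the start of a model file and report its format."""
    with open(path, "rb") as f:
        (magic,) = struct.unpack("<I", f.read(4))  # magic is stored as a little-endian uint32
    print(f"{path}: 0x{magic:08x} -> {MAGICS.get(magic, 'unknown format')}")

if __name__ == "__main__":
    inspect(sys.argv[1])
```

If the magic turns out to be one of the older ones, converting the model with the migration/conversion tooling bundled with llama.cpp at the time, or re-downloading a model already published in the newer format, should make the mmap warning go away.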

Metadata


    Labels

    primordial: Related to the primordial version of PrivateGPT, which is now frozen in favour of the new PrivateGPT
