[Investigate] Custom llama.dll Dependency Resolution Issues on Windows #12

Closed
@MillionthOdin16

Description

This is a note about using a custom llama.dll build on Windows. I ran into dependency resolution issues when loading my own llama.dll, compiled with BLAS support and some extra hardware-specific optimization flags. No matter what I do, it can't seem to locate all of its dependencies, even though I've tried placing them in system paths and even in the same directory as the DLL.

My current workaround is using the default llama.dll that llama-cpp-python builds, but it doesn't have the hardware optimizations and BLAS support that I enabled in my custom build. So I'm still trying to figure out what my issue is. Maybe it's something Python-specific that I'm missing...
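One thing worth checking: since Python 3.8, Windows DLL loading no longer consults `PATH` when resolving a library's dependencies, so a custom llama.dll whose dependent DLLs (e.g. a BLAS runtime) live elsewhere can fail to load even when those directories are on `PATH`. A minimal sketch of loading a DLL with its directory registered explicitly via `os.add_dll_directory` (the paths here are placeholders, and this is a general ctypes pattern, not code from this repo):

```python
import ctypes
import os
import sys

def load_llama_dll(dll_path):
    """Load a custom llama.dll, registering its folder so Windows can
    resolve dependent DLLs placed next to it (Python 3.8+ ignores PATH
    for dependency resolution)."""
    dll_dir = os.path.dirname(os.path.abspath(dll_path))
    if sys.platform == "win32":
        # Make the DLL's own directory searchable for its dependencies,
        # e.g. a BLAS runtime shipped alongside the custom build.
        os.add_dll_directory(dll_dir)
        return ctypes.WinDLL(dll_path)
    # Non-Windows fallback, included only so the sketch is runnable.
    return ctypes.CDLL(dll_path)
```

If the dependencies live in several folders, each one needs its own `os.add_dll_directory` call before loading. Tools like Dependencies (the modern Dependency Walker) can show exactly which DLL in the chain fails to resolve.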

I'm dropping this issue here just in case anyone else runs into something similar. If you have any ideas or workarounds, let me know. I'll keep trying to figure it out until I get it resolved haha :)
