# Electron crashes when loadModel finishes loading (beta)
### Expected Behavior

After loading a model using the code below and trying to create a context, I'd expect there not to be a crash in `llama-addon.node`. I've tried with and without Metal enabled.

If a crash happens, I'd expect an error log from node-llama-cpp, but no error logs show up either.
### Actual Behavior

When creating a context with Llama 3 in Electron v28, Electron crashes at runtime with an `EXC_CRASH` error (and no console logs).

### Steps to Reproduce

(The code below works fine in terminal Node.js, which may suggest this is an issue on Electron's side, but I don't know enough to be sure; please let me know if you have ideas on where to start debugging.)

1. Use the following code in `src/main.ts` to load a model:
```ts
// In this file you can include the rest of your app's specific main process
// code. You can also put them in separate files and import them here.
import path from 'path';
import { app } from 'electron';

async function main() {
  const { getLlama, LlamaChatSession, Llama3ChatWrapper } = await import(
    'node-llama-cpp'
  );

  const llama = await getLlama();
  try {
    console.log('Loading model');
    const model = await llama.loadModel({
      modelPath: path.join(
        app.getPath('sessionData'),
        'models',
        'Meta-Llama-3-8B-Instruct-Q4_K_M.gguf'
      ),
      onLoadProgress(loadProgress: number) {
        console.log(`Load progress: ${loadProgress * 100}%`);
      },
    });

    const context = await model.createContext(); // crash happens here
    const session = new LlamaChatSession({
      contextSequence: context.getSequence(),
      chatWrapper: new Llama3ChatWrapper(),
    });

    const a1 = await session.prompt('test ');
    console.log(a1);
  } catch (e) {
    console.log(e);
  }
}

main();
```
2. Run the application.
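As a possible starting point for debugging, here is a minimal sketch (not verified against this particular crash) using Electron's built-in `crashReporter` to capture a local minidump when the native `llama-addon.node` module crashes:

```ts
// Sketch: capture a local minidump of the native crash for inspection.
// crashReporter.start() should run as early as possible in the main process.
import { app, crashReporter } from 'electron';

crashReporter.start({ uploadToServer: false });

app.whenReady().then(() => {
  // Minidumps are written here and can be inspected with Crashpad/Breakpad tools.
  console.log('Crash dumps directory:', app.getPath('crashDumps'));
});
```

On macOS, the system crash report under `~/Library/Logs/DiagnosticReports` may also contain a native stack trace for the `EXC_CRASH`.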
### My Environment

| Dependency | Version |
| --- | --- |
| Operating System | macOS Sonoma 14.2.1 |
| CPU | Apple M1 Max |
| Node.js version | v18.19 |
| TypeScript version | 5.4.2 |
| node-llama-cpp version | 3.0.0-beta.17 |
### Additional Context

No response
### Relevant Features Used

- Metal support
- CUDA support
- Grammar
### Are you willing to resolve this issue by submitting a Pull Request?

Yes, I have the time, but I don't know how to start. I would need guidance.