Analyzes files using AI and lets you query them.
If this package helped you, consider supporting my development on Patreon or GitHub.
Repository: LaravelAILabs/file-assistant-demo
URL: https://file-assistant.laravelailabs.com
You can install the package via Composer:

```bash
composer require laravelailabs/file-assistant
```
You can publish the config file with:

```bash
php artisan vendor:publish --tag="file-assistant-config"
```
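As a rough sketch, the published config file could look like the following. The key names below are assumptions inferred from the environment variables documented later in this README, not a verbatim copy of the package's actual config file:

```php
<?php

// config/file-assistant.php — hypothetical sketch. Key names are assumptions
// based on the .env variables documented below.
return [
    'openai_api_key' => env('FILE_ASSISTANT_OPENAI_API_KEY'),

    'pinecone' => [
        'api_key'     => env('VECTOR_STORE_PINECONE_API_KEY'),
        'environment' => env('VECTOR_STORE_PINECONE_ENVIRONMENT'),
        'dataset'     => env('FILE_ASSISTANT_PINECONE_DATASET'),
    ],

    // The table names created by the package are also expected to be
    // configurable here (see the database section below).
    'tables' => [
        'conversations'      => 'conversations',
        'messages'           => 'messages',
        'files'              => 'files',
        'conversation_files' => 'conversation_files',
    ],
];
```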
The package currently uses Pinecone.io as the vector database and OpenAI as the LLM. Support is planned for any LLM, as well as any vector database implemented in Laravel Vector Store.
Add the following secrets to your `.env`:

```env
VECTOR_STORE_PINECONE_API_KEY=YOUR_PINECONE_API_KEY
VECTOR_STORE_PINECONE_ENVIRONMENT=YOUR_PINECONE_ENVIRONMENT
FILE_ASSISTANT_OPENAI_API_KEY=YOUR_OPENAI_API_KEY
FILE_ASSISTANT_PINECONE_DATASET=YOUR_PINECONE_INDEX_NAME
```
You can find your OpenAI API key in your OpenAI account dashboard.
```php
$dialog = FileAssistant::addFile('PATH_TO_YOUR_FILE')
    ->addFile('PATH_TO_YOUR_SECOND_FILE')
    ->initialize();

echo $dialog->prompt('What is this document about?');
```
```php
$dialog = FileAssistant::setConversation(Conversation::find(1))
    ->setUser(Auth::user())
    ->initialize();

// Grab the conversation and display its messages.
/** @var \LaravelAILabs\FileAssistant\Models\Conversation $conversation */
$conversation = $dialog->getConversation();

foreach ($conversation->messages as $message) {
    echo sprintf('%s: %s <br>', $message->role, $message->content);
}

echo $dialog->prompt('Where did we leave off?');
```
The package creates four tables: `conversations`, `messages`, `files`, and `conversation_files`. Feel free to rename them using the file-assistant config environment variables. Use the provided models to interact with the database and display the messages of a conversation.
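For example, listing stored conversations with the shipped `Conversation` model might look like the sketch below. The `messages` relationship appears in the example above; the use of `withCount` is standard Eloquent, but treat the overall snippet as an illustration rather than documented package API:

```php
use LaravelAILabs\FileAssistant\Models\Conversation;

// Hypothetical sketch: list every stored conversation with its message count.
$conversations = Conversation::withCount('messages')->get();

foreach ($conversations as $conversation) {
    echo sprintf(
        'Conversation #%d has %d messages<br>',
        $conversation->id,
        $conversation->messages_count
    );
}
```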
Supported file types:

- Word
- TXT
```bash
composer test
```
Please see CHANGELOG for more information on what has changed recently.
The MIT License (MIT). Please see License File for more information.