SearchAI PrivateLLM Installation
For Windows
Steps to Install SearchAI PrivateLLM
- After installing SearchBlox as described in the Installing on Windows section, navigate to `C:\SearchBloxServer\ollama`.
- Download and extract the `ranker` model folder from the following link: ranker.zip
- Navigate to `<SearchBlox-installation>/webapps/ROOT/models` and place the `ranker` model downloaded in the previous step there.
- Download the models folder for Ollama from the following link: SearchAI PrivateLLM models
- Extract the downloaded `models` folder and navigate to `C:\SearchBloxServer\ollama\models`.
- Replace the `models` folder with the downloaded one.
- Go to `C:\SearchBloxServer\ollama` and run `ollamaSetup.exe`.
- Set the `OLLAMA_MODELS` environment variable as follows:
  - Follow steps 1 and 2 from Prerequisites.
  - Create a new User Variable `OLLAMA_MODELS` with the value `C:\SearchBloxServer\ollama\models`.
  - Click on OK.
- Restart the Windows server.
- After the restart, open the following link to check SearchAI PrivateLLM (the default port is 11434):
  ```
  http://localhost:<port>
  ```
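Beyond opening the URL in a browser, the installation can be verified by querying Ollama's `/api/tags` endpoint, which lists the installed models. A minimal sketch in Python, assuming the default port 11434 and the standard Ollama REST API (the helper names are illustrative, not part of SearchBlox):

```python
import json
import urllib.request

def parse_model_names(payload):
    """Extract installed model names from an Ollama /api/tags JSON payload."""
    return [model["name"] for model in payload.get("models", [])]

def list_private_llm_models(port=11434):
    """Query the local Ollama server and return the names of installed models."""
    url = f"http://localhost:{port}/api/tags"
    with urllib.request.urlopen(url, timeout=5) as resp:
        return parse_model_names(json.load(resp))
```

If the call succeeds and returns a non-empty list, SearchAI PrivateLLM is serving its models; a connection error usually means the Ollama service has not started after the reboot.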
For CentOS, RHEL/Ubuntu/Amazon Linux 2
Steps to Install and Run SearchAI PrivateLLM
- After installing SearchBlox as described in the Installation section, download the models folder from the following link: SearchAI PrivateLLM Models
- After downloading, extract the `models` folder.
- Navigate to `/opt/searchblox/ollama/models` and replace the `models` folder with the downloaded one.
- To create a service file, navigate to `/etc/systemd/system/` and create an `ollama.service` file by running the following command:
  ```
  vi /etc/systemd/system/ollama.service
  ```
- Update the `/etc/systemd/system/ollama.service` file with the following script:
  ```
  [Unit]
  Description=Ollama Service
  After=network-online.target

  [Service]
  WorkingDirectory=/opt/searchblox/ollama/bin
  ExecStart=/opt/searchblox/ollama/bin/ollama serve
  User=root
  Group=root
  Restart=always
  RestartSec=3
  Environment="OLLAMA_MODELS=/opt/searchblox/ollama/models/"

  [Install]
  WantedBy=default.target
  ```
- Provide execute permission for the `ollama.service` file by executing the following commands:
  ```
  chmod 755 /etc/systemd/system/ollama.service
  chmod 755 /opt/searchblox/ollama/bin/ollama
  ```
- Reload systemd so it picks up the new `ollama.service` by running the following command:
  ```
  systemctl daemon-reload
  ```
- Start the Ollama service by running the following command:
  ```
  systemctl start ollama
  ```
- To check the models, run the following command:
  ```
  /opt/searchblox/ollama/bin/ollama list
  ```
- To stop the Ollama service, run the following command:
  ```
  systemctl stop ollama
  ```
- To auto-start the Ollama service when the system reboots, run the following command:
  ```
  systemctl enable ollama
  ```
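When rolling the steps above out across several Linux servers, the unit file can be generated from the install prefix instead of edited by hand. A sketch, assuming the default `/opt/searchblox/ollama` layout shown above (`render_ollama_unit` is a hypothetical helper, not part of SearchBlox):

```python
def render_ollama_unit(prefix="/opt/searchblox/ollama"):
    """Render the ollama.service unit file for a given install prefix."""
    return (
        "[Unit]\n"
        "Description=Ollama Service\n"
        "After=network-online.target\n"
        "\n"
        "[Service]\n"
        f"WorkingDirectory={prefix}/bin\n"
        f"ExecStart={prefix}/bin/ollama serve\n"
        "User=root\n"
        "Group=root\n"
        "Restart=always\n"
        "RestartSec=3\n"
        f'Environment="OLLAMA_MODELS={prefix}/models/"\n'
        "\n"
        "[Install]\n"
        "WantedBy=default.target\n"
    )

# Write the unit file, then continue with `systemctl daemon-reload` as above:
# with open("/etc/systemd/system/ollama.service", "w") as f:
#     f.write(render_ollama_unit())
```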
