SearchAI PrivateLLM Installation
For Windows
Steps to Install SearchAI PrivateLLM
- After installing SearchBlox as described in the Installing on Windows section, navigate to `C:\SearchBloxServer\ollama`.
- Download and extract the `ranker` model folder from the following link: ranker.zip
- Navigate to `<SearchBlox-installation>/webapps/ROOT/models` and place the `ranker` model downloaded in the previous step there.
- Download the models folder for Ollama from the following link: SearchAI PrivateLLM models
- Extract the downloaded `models` folder and navigate to `C:\SearchBloxServer\ollama\models`.
- Replace the `models` folder with the downloaded one.
- Go to `C:\SearchBloxServer\ollama` and run `ollamaSetup.exe`.
- Set the `OLLAMA_MODELS` environment variable as follows:
  - Follow steps 1 and 2 from the Prerequisites.
  - Create a new User Variable `OLLAMA_MODELS` with the value `C:\SearchBloxServer\ollama\models`.
  - Click **OK**.
- Restart the Windows server.
- After the restart, open the following link to check SearchAI PrivateLLM. The default port is 11434.

  ```
  http://localhost:<port>
  ```
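Once the server is back up, the check above can also be run from a terminal. This is a minimal sketch, assuming the default port 11434 and that `curl` is available; a running Ollama instance replies on its root path, and the fallback message prints when it is not reachable.

```shell
# Sketch: probe the SearchAI PrivateLLM (Ollama) endpoint.
# PORT assumes the default 11434; change it if you configured another port.
PORT=11434
curl -s --max-time 5 "http://localhost:${PORT}/" \
  || echo "Ollama is not reachable on port ${PORT}"
```

If the service is up, the response confirms Ollama is running; otherwise the fallback message tells you to revisit the installation steps above.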
For CentOS, RHEL, Ubuntu, and Amazon Linux 2
Steps to Install and Run SearchAI PrivateLLM
- After installing SearchBlox as described in the Installation section, download the models folder from the following link: SearchAI PrivateLLM Models
- After downloading, extract the `models` folder.
- Navigate to `/opt/searchblox/ollama/models` and replace the `models` folder with the downloaded one.
- To create a service file, navigate to `/etc/systemd/system/` and create an `ollama.service` file by running the following command:

  ```
  vi /etc/systemd/system/ollama.service
  ```
- Update the `/etc/systemd/system/ollama.service` file with the following script:

  ```
  [Unit]
  Description=Ollama Service
  After=network-online.target

  [Service]
  WorkingDirectory=/opt/searchblox/ollama/bin
  ExecStart=/opt/searchblox/ollama/bin/ollama serve
  User=root
  Group=root
  Restart=always
  RestartSec=3
  Environment="OLLAMA_MODELS=/opt/searchblox/ollama/models/"

  [Install]
  WantedBy=default.target
  ```
- Provide execute permission for the `ollama.service` file and the `ollama` binary by executing the following commands:

  ```
  chmod 755 /etc/systemd/system/ollama.service
  chmod 755 /opt/searchblox/ollama/bin/ollama
  ```
- Reload systemd so it picks up the new `ollama.service` file by running the following command:

  ```
  systemctl daemon-reload
  ```
- Start the Ollama service by running the following command:

  ```
  systemctl start ollama
  ```
- To check the models, run the following command:

  ```
  /opt/searchblox/ollama/bin/ollama list
  ```
- To stop the Ollama service, run the following command:

  ```
  systemctl stop ollama
  ```
- To auto-start the Ollama service when the system reboots, run the following command:

  ```
  systemctl enable ollama
  ```
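As an alternative to editing with `vi`, the service file from the steps above can be written non-interactively with a heredoc. This is a sketch, not part of the official steps: `UNIT` is a hypothetical variable defaulting to `/tmp/ollama.service` so the snippet can be tried anywhere; on the actual server, point it at `/etc/systemd/system/ollama.service` (which requires root).

```shell
# Sketch: write the ollama.service unit from the doc with a heredoc instead of vi.
# UNIT defaults to /tmp for a safe dry run; set UNIT=/etc/systemd/system/ollama.service
# (as root) for the real installation.
UNIT="${UNIT:-/tmp/ollama.service}"
cat > "$UNIT" <<'EOF'
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
WorkingDirectory=/opt/searchblox/ollama/bin
ExecStart=/opt/searchblox/ollama/bin/ollama serve
User=root
Group=root
Restart=always
RestartSec=3
Environment="OLLAMA_MODELS=/opt/searchblox/ollama/models/"

[Install]
WantedBy=default.target
EOF
grep -c '^\[' "$UNIT"   # counts the unit's section headers: [Unit], [Service], [Install]
```

After writing the real file, run `systemctl daemon-reload` before starting the service, as in the steps above.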