SearchAI PrivateLLM Installation

For Windows

Steps to Install SearchAI PrivateLLM

  1. After installing SearchBlox as described in the Installing on Windows section, navigate to C:\SearchBloxServer\ollama.

  2. Download and extract the ranker model folder from the following link:
    ranker.zip

  3. Navigate to <SearchBlox-installation>/webapps/ROOT/models and place the ranker model folder downloaded in the previous step there.

  4. Download the models folder for Ollama from the following link:
    SearchAI PrivateLLM models

  5. Extract the downloaded models folder and navigate to C:\SearchBloxServer\ollama\models.

  6. Replace the models folder with the downloaded one.

  7. Go to C:\SearchBloxServer\ollama and run ollamaSetup.exe.

  8. Set the OLLAMA_MODELS environment variable as follows:

    • Follow steps 1 and 2 from the Prerequisites section.

    • Create a new User Variable OLLAMA_MODELS with the value C:\SearchBloxServer\ollama\models.

    • Click OK.

  9. Restart the Windows Server.

  10. After the restart, open the following link to check that SearchAI PrivateLLM is running; the default port is 11434.
    http://localhost:<port>
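
To confirm the endpoint from a terminal, you can sketch the check as below. This assumes the default port 11434; OLLAMA_PORT is a hypothetical variable introduced here so you can adjust the port if you changed it during setup.

```shell
# Build the endpoint URL; 11434 is Ollama's default port.
OLLAMA_PORT="${OLLAMA_PORT:-11434}"
OLLAMA_URL="http://localhost:${OLLAMA_PORT}"
echo "Checking ${OLLAMA_URL}"
# With the service running, a healthy instance typically responds
# with "Ollama is running":
# curl -s "${OLLAMA_URL}"
```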

For CentOS, RHEL, Ubuntu, and Amazon Linux 2

Steps to Install and Run SearchAI PrivateLLM

  1. After installing SearchBlox as described in the Installation section, download the models folder from the following link.
    SearchAI PrivateLLM Models
  2. After downloading, extract the models folder.
  3. Navigate to /opt/searchblox/ollama/models and replace the models folder with the downloaded one.
  4. To create the service file, navigate to /etc/systemd/system/ and create an ollama.service file by running the following command:
    vi /etc/systemd/system/ollama.service
  5. Update the /etc/systemd/system/ollama.service file with the following contents:
    [Unit]
    Description=Ollama Service
    After=network-online.target
    [Service]
    WorkingDirectory=/opt/searchblox/ollama/bin
    ExecStart=/opt/searchblox/ollama/bin/ollama serve
    User=root
    Group=root
    Restart=always
    RestartSec=3
    Environment="OLLAMA_MODELS=/opt/searchblox/ollama/models/"
    [Install]
    WantedBy=default.target
    
  6. Provide execute permissions for the ollama.service file and the ollama binary by running the following commands:
    chmod 755 /etc/systemd/system/ollama.service
    chmod 755 /opt/searchblox/ollama/bin/ollama
  7. Reload systemd so it registers the new service file by running the following command:
    systemctl daemon-reload
  8. Start the Ollama service by running the following command:
    systemctl start ollama
  9. To list the installed models, run the following command:
    /opt/searchblox/ollama/bin/ollama list
  10. To stop the Ollama service, run the following command:
    systemctl stop ollama
  11. To start the Ollama service automatically when the system reboots, run the following command:
    systemctl enable ollama
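
The permission, reload, start, and enable steps above can be sketched as a single script. This is a minimal sketch assuming the SearchBlox paths used in this guide; it defaults to a dry run that only prints each command, so set DRY_RUN=0 when running it on the actual server as root.

```shell
#!/bin/sh
# Consolidates steps 6-8 and 11 above. Defaults to a dry run that
# only echoes each command; set DRY_RUN=0 to execute for real.
DRY_RUN="${DRY_RUN:-1}"

run() {
  if [ "$DRY_RUN" = "1" ]; then
    echo "would run: $*"
  else
    "$@"
  fi
}

run chmod 755 /etc/systemd/system/ollama.service
run chmod 755 /opt/searchblox/ollama/bin/ollama
run systemctl daemon-reload
run systemctl start ollama
run systemctl enable ollama
```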