SearchAI PrivateLLM Installation

For Windows

This guide outlines the steps to install SearchAI PrivateLLM on a Windows server after SearchBlox has been installed.

📘 Prerequisites:

SearchBlox has been installed following the instructions in Installing on Windows.

Steps to Install SearchAI PrivateLLM

  1. Install SearchBlox by following the Installing on Windows section.

  2. Navigate to Ollama Directory:
    Open Windows Explorer and navigate to C:\SearchBloxServer\ollama.

  3. Download and Extract Ranker Model:
    Download the ranker.zip file from the following link: ranker.zip
    Extract the contents of the zip file.

  4. Place Ranker Model:
    Navigate to <SearchBlox-installation>/webapps/ROOT/models.
    Copy the extracted ranker folder into this directory.
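    For example, from a PowerShell prompt (a sketch assuming ranker.zip was downloaded to the current directory and the default C:\SearchBloxServer location stands in for <SearchBlox-installation>):

    # Extract ranker.zip and copy the extracted ranker folder into the models directory
    Expand-Archive -Path .\ranker.zip -DestinationPath .
    Copy-Item -Recurse .\ranker "C:\SearchBloxServer\webapps\ROOT\models\ranker"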

  5. Download Ollama Models:
    Download the models.zip file for Ollama from:
    SearchAI PrivateLLM models

  6. Extract and Replace Models Folder:
    Extract the downloaded models folder.
    Navigate to C:\SearchBloxServer\ollama\models.
    Replace the existing models folder with the extracted one.
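    For example, in PowerShell (a sketch assuming models.zip is in the current directory and contains a top-level models folder; the existing folder is removed before the extracted one is moved into place):

    # Replace the existing models folder with the downloaded one
    Expand-Archive -Path .\models.zip -DestinationPath .
    Remove-Item -Recurse -Force "C:\SearchBloxServer\ollama\models"
    Move-Item .\models "C:\SearchBloxServer\ollama\models"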

  7. Run Ollama Setup:
    Navigate to C:\SearchBloxServer\ollama and run ollamaSetup.exe.

  8. Set Environment Variables:
    Set the OLLAMA_MODELS environment variable as shown in the following (a command-line alternative is sketched after these steps):

  • Open the Environment Variables dialog (follow steps 1 and 2 from the Prerequisites).

  • Create a new user variable named OLLAMA_MODELS with the value C:\SearchBloxServer\ollama\models.

  • Click OK.
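    If you prefer the command line, the same user variable can be set with setx (note that setx changes apply to new console sessions, not the current one):

    # Set OLLAMA_MODELS as a user environment variable
    setx OLLAMA_MODELS "C:\SearchBloxServer\ollama\models"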

  9. Restart the Windows Server.
  10. Verify Installation:
    After the server restarts, open a web browser and navigate to http://localhost:<port>, where <port> is the Ollama port (default is 11434). A successful response confirms that SearchAI PrivateLLM is running.
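    You can also verify from a command prompt; a minimal check, assuming the default port, is:

    # A running Ollama instance responds with "Ollama is running"
    curl http://localhost:11434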

For CentOS, RHEL/Ubuntu/Amazon Linux 2

Steps to Install and Run SearchAI PrivateLLM

  1. After installing SearchBlox as described in the Installation section, download the models folder from the following link:
    SearchAI PrivateLLM Models
  2. After downloading, extract the models folder.
  3. Navigate to /opt/searchblox/ollama/models and replace the models folder with the downloaded one, as shown below.
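    For example (a sketch assuming the download is a models.zip archive in the current directory containing a top-level models folder):

    # Extract the archive and swap in the new models folder
    unzip models.zip
    rm -rf /opt/searchblox/ollama/models
    mv models /opt/searchblox/ollama/models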
  4. To create a service file, navigate to /etc/systemd/system/ and create an ollama.service file by running the following command:
    vi /etc/systemd/system/ollama.service
  5. Update the /etc/systemd/system/ollama.service file with following script:
    [Unit]
    Description=Ollama Service
    After=network-online.target
    [Service]
    WorkingDirectory=/opt/searchblox/ollama/bin
    ExecStart=/opt/searchblox/ollama/bin/ollama serve
    User=root
    Group=root
    Restart=always
    RestartSec=3
    Environment="OLLAMA_MODELS=/opt/searchblox/ollama/models/"
    [Install]
    WantedBy=default.target
    
  6. Provide execute permissions for the ollama.service file and the ollama binary by executing the following commands:
    chmod 755 /etc/systemd/system/ollama.service
    chmod 755 /opt/searchblox/ollama/bin/ollama
  7. Reload systemd so it picks up the new ollama.service file by running the following command:
    systemctl daemon-reload
  8. Start the Ollama service by running the following command:
    systemctl start ollama
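    To confirm the service started cleanly, check its status:

    # Shows the current state and recent log lines for the service
    systemctl status ollama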
  9. To list the installed models, run the following command:
    /opt/searchblox/ollama/bin/ollama list
  10. To stop the Ollama service, run the following command:
    systemctl stop ollama
  11. To auto-start the Ollama service when the system reboots, run the following command:
    systemctl enable ollama
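    On distributions with systemd 220 or later, enabling and starting can be combined into a single command (equivalent to running steps 8 and 11 together):

    # Enable at boot and start immediately
    systemctl enable --now ollama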