PrivateGPT and GLIBC version in Replit

I’m trying to run PrivateGPT on my Replit, but there is a GLIBC compatibility issue. I know this is a package issue, but is there a potential workaround to get the package installed on Replit?

× Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [13 lines of output]
/tmp/pip-build-env-51apicer/overlay/lib/python3.10/site-packages/cmake/data/bin/cmake: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.33' not found (required by /nix/store/mdck89nsfisflwjv6xv8ydj7dj0sj2pn-gcc-11.3.0-lib/lib/libstdc++.so.6)
/tmp/pip-build-env-51apicer/overlay/lib/python3.10/site-packages/cmake/data/bin/cmake: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by /nix/store/mdck89nsfisflwjv6xv8ydj7dj0sj2pn-gcc-11.3.0-lib/lib/libstdc++.so.6)
/tmp/pip-build-env-51apicer/overlay/lib/python3.10/site-packages/cmake/data/bin/cmake: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found (required by /nix/store/mdck89nsfisflwjv6xv8ydj7dj0sj2pn-gcc-11.3.0-lib/lib/libstdc++.so.6)
/tmp/pip-build-env-51apicer/overlay/lib/python3.10/site-packages/cmake/data/bin/cmake: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found (required by /nix/store/mdck89nsfisflwjv6xv8ydj7dj0sj2pn-gcc-11.3.0-lib/lib/libgcc_s.so.1)
Traceback (most recent call last):
File "/tmp/pip-build-env-51apicer/overlay/lib/python3.10/site-packages/skbuild/setuptools_wrap.py", line 645, in setup
cmkr = cmaker.CMaker(cmake_executable)
File "/tmp/pip-build-env-51apicer/overlay/lib/python3.10/site-packages/skbuild/cmaker.py", line 148, in __init__
self.cmake_version = get_cmake_version(self.cmake_executable)
File "/tmp/pip-build-env-51apicer/overlay/lib/python3.10/site-packages/skbuild/cmaker.py", line 105, in get_cmake_version
raise SKBuildError(msg) from err
raise SKBuildError(msg) from err

  Problem with the CMake installation, aborting build. CMake executable is /tmp/pip-build-env-51apicer/overlay/lib/python3.10/site-packages/cmake/data/bin/cmake
  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects

ldd --version
ldd (GNU libc) 2.35


Try adding glibc to your replit.nix file, something like this:

{ pkgs }: {
  # ...
  env = {
    PYTHON_LD_LIBRARY_PATH = pkgs.lib.makeLibraryPath [
      # ...
      pkgs.glibc
      # ...
    ];
    # ...
  };
}

Thank you. That didn’t seem to work.

Here is my replit.nix code:

{ pkgs }: {
  deps = [
    pkgs.glusterfs
    pkgs.adoptopenjdk-bin
    pkgs.sudo
    pkgs.python310Full
    pkgs.replitPackages.prybar-python310
    pkgs.replitPackages.stderred
  ];
  env = {
    PYTHON_LD_LIBRARY_PATH = pkgs.lib.makeLibraryPath [
      pkgs.stdenv.cc.cc.lib
      pkgs.zlib
      pkgs.glib
      pkgs.xorg.libX11
      pkgs.glibc
    ];
    PYTHONHOME = "${pkgs.python310Full}";
    PYTHONBIN = "${pkgs.python310Full}/bin/python3.10";
    LANG = "en_US.UTF-8";
    STDERREDBIN = "${pkgs.replitPackages.stderred}/bin/stderred";
    PRYBAR_PYTHON_BIN = "${pkgs.replitPackages.prybar-python310}/bin/prybar-python310";
  };
}

I’m now getting the following error:

~/.../gpt/privateGPT$ pip3 install -r requirements.txt
Looking in indexes: https://package-proxy.replit.com/pypi/simple/
Collecting langchain==0.0.177 (from -r requirements.txt (line 1))
  Using cached https://package-proxy.replit.com/pypi/packages/e6/2a/b108911383cbebe658208e771de45663d7ce0b851315d9825bd73e816550/langchain-0.0.177-py3-none-any.whl (877 kB)
Collecting gpt4all==0.2.3 (from -r requirements.txt (line 2))
  Using cached https://package-proxy.replit.com/pypi/packages/b0/e7/9ddf095033e2029ababaad297386c74526fff12c7b1ec5c54879890fb755/gpt4all-0.2.3-py3-none-manylinux1_x86_64.whl (329 kB)
Collecting chromadb==0.3.23 (from -r requirements.txt (line 3))
  Using cached https://package-proxy.replit.com/pypi/packages/b8/74/29f431b81db5c4c1b4e1a6ab851f82db59b593a9f0f2858f8eb044df2809/chromadb-0.3.23-py3-none-any.whl (71 kB)
Collecting llama-cpp-python==0.1.50 (from -r requirements.txt (line 4))
  Using cached https://package-proxy.replit.com/pypi/packages/82/2c/9614ef76422168fde5326095559f271a22b1926185add8ae739901e113b9/llama_cpp_python-0.1.50.tar.gz (1.2 MB)
  Installing build dependencies ... error
  error: subprocess-exited-with-error
  
  × pip subprocess to install build dependencies did not run successfully.
  │ exit code: 127
  ╰─> [1 lines of output]
      /usr/bin/env: symbol lookup error: /nix/store/4nlgxhb09sdr51nc9hdm8az5b08vzkgx-glibc-2.35-163/lib/libc.so.6: undefined symbol: _dl_fatal_printf, version GLIBC_PRIVATE
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× pip subprocess to install build dependencies did not run successfully.
│ exit code: 127
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.

Is it possible to specify a version of glibc?

You can search for it here, but from a quick search it looks like only that version is available.
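
If a specific glibc version were ever needed, one possible (untested) approach is to pin an older nixpkgs snapshot inside replit.nix and take glibc from that instead of the default channel. This is only a sketch: the commit hash is a placeholder, not a real revision, and the rest of the file is trimmed down for illustration.

{ pkgs }:
let
  # Hypothetical pin: replace <nixpkgs-commit> with an actual nixpkgs revision
  # that still ships the glibc version you want.
  pinned = import (builtins.fetchTarball
    "https://github.com/NixOS/nixpkgs/archive/<nixpkgs-commit>.tar.gz") { };
in {
  deps = [
    pkgs.python310Full
  ];
  env = {
    # Put the pinned glibc (and a matching libstdc++) on the library path
    # instead of the one from the default channel.
    PYTHON_LD_LIBRARY_PATH = pkgs.lib.makeLibraryPath [
      pinned.glibc
      pinned.stdenv.cc.cc.lib
    ];
  };
}

Note that mixing a Nix-built glibc with binaries that expect the system loader can itself trigger GLIBC_PRIVATE symbol lookup errors like the one shown above, so this may simply move the problem rather than fix it.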