
Unable to install lp-sparsemap #12

Open
ManishSharma1609 opened this issue Feb 28, 2024 · 1 comment


@ManishSharma1609

I am facing an issue while trying to install the lp-sparsemap package. The installation fails with the error shown in the screenshots below.

I have already tried upgrading pip (`pip install --upgrade pip`), but the issue persists.

[Screenshot 2024-02-29 012147]
[Screenshot 2024-02-29 012222]

@arek544

arek544 commented Aug 5, 2024

I've encountered the same problem, and this part, at least, has a straightforward fix: set the EIGEN_DIR environment variable to the path of your Eigen headers before running the install.
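A minimal sketch of that setup, assuming Eigen has been cloned to `~/eigen` (the path is an example; point EIGEN_DIR wherever your Eigen checkout actually lives):

```shell
# EIGEN_DIR must point at the directory containing the Eigen/ headers,
# e.g. a clone of https://gitlab.com/libeigen/eigen.
# The path below is an example; substitute your own location.
export EIGEN_DIR="$HOME/eigen"

# With the variable set, retry the install:
# pip install lp-sparsemap
```

Eigen is header-only, so no build step is needed for it; the variable just tells the lp-sparsemap build where to find the headers.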

However, I still can't manage to install the package. The following errors occur:

      lpsmap/ad3ext/FactorTreeTurbo.h:24:1: note: ‘std::tie’ is defined in header ‘<tuple>’; did you forget to ‘#include <tuple>’?
         23 | #include "DependencyDecoder.h"
        +++ |+#include <tuple>
         24 |
      error: command '/usr/bin/gcc' failed with exit code 1
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for lp-sparsemap
Failed to build lp-sparsemap
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (lp-sparsemap) 

The full error trace is in the gist below:
https://gist.github.com/KarolinaPondel/992cb3490e58377d0e0312b31f1fda57
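The compiler's own note points at the likely fix: `FactorTreeTurbo.h` uses `std::tie` without including `<tuple>`, and newer GCC/libstdc++ releases no longer pull that header in transitively. A minimal illustration of what the one-line patch enables (the function name here is hypothetical, just to demonstrate the include):

```cpp
#include <tuple>    // the include the compiler note says is missing
#include <utility>

// std::tie is declared in <tuple>; without that include, recent GCC
// versions emit exactly the note shown above. Adding #include <tuple>
// near the top of lpsmap/ad3ext/FactorTreeTurbo.h should let the
// compilation proceed past this error.
std::pair<int, int> tie_demo() {
    int a = 0, b = 0;
    std::tie(a, b) = std::make_tuple(1, 2);  // requires <tuple>
    return {a, b};
}
```

Until the patch lands upstream, the same edit can be applied locally to a source checkout before running `pip install .`.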
