Hi Helena!
Let's say my data (X) are represented as sparse positive-integer vectors of dimension about 20,000.
And let's say I have a few thousand such data points.
Will ugtm work, or not?
I am asking before I start to invest manpower into using ugtm...
Thanks a lot!
F.
UnixJunkie changed the title to "Will it work on high dimensional (but sparse) encoded data? [Question]" (Oct 6, 2021)
Hi there! It would work, but in my experience the method is very sensitive to the curse of dimensionality, which is one of its main drawbacks: you will get lots of points mapped to the same coordinates. I'd use PCA preprocessing to get a 50-100 dimensional feature space (depending on the % of variance explained).
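A minimal sketch of that preprocessing step, assuming a scikit-learn workflow; the `eGTM` estimator name in the comment is taken from the ugtm documentation and should be checked against the installed version:

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical toy data standing in for the real matrix
# (a few thousand points x ~20,000 sparse integer features).
X = np.random.poisson(0.05, size=(1000, 5000)).astype(float)

# Keep at most 100 components, as suggested above; check how much
# variance they explain before committing to that number.
pca = PCA(n_components=100)
X_reduced = pca.fit_transform(X)
print("explained variance:", pca.explained_variance_ratio_.sum())

# The reduced matrix is what would go into ugtm, e.g. (API assumed from
# the ugtm documentation, to be verified against the installed version):
# from ugtm import eGTM
# embedding = eGTM().fit_transform(X_reduced)
```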
Could the package be parameterized by a distance function?
I.e., even if the points are high-dimensional, I could provide a distance function that behaves well on them,
and the calculation would then use that distance function rather than processing the X coordinates directly.
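For illustration only (this is not an existing ugtm option), one way to approximate that idea outside the package would be to precompute a distance matrix with a user-supplied function and embed it into a modest number of features, e.g. with MDS, before running the GTM. A minimal sketch, with a hypothetical Jaccard-style distance on the non-zero positions:

```python
import numpy as np
from sklearn.metrics import pairwise_distances
from sklearn.manifold import MDS

def my_distance(a, b):
    # Hypothetical distance that behaves well on sparse integer vectors:
    # Jaccard-style dissimilarity on the non-zero positions.
    a_nz, b_nz = a > 0, b > 0
    union = np.logical_or(a_nz, b_nz).sum()
    inter = np.logical_and(a_nz, b_nz).sum()
    return 1.0 - inter / union if union else 0.0

# Toy stand-in for the real data.
X = np.random.poisson(0.05, size=(300, 2000))

# n x n matrix of pairwise distances under the user-supplied function.
D = pairwise_distances(X, metric=my_distance)

# Embed the distances into a 50-dimensional feature space that a GTM
# could consume; the MDS step is my suggestion, not part of ugtm.
X_embedded = MDS(n_components=50, dissimilarity="precomputed").fit_transform(D)
```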
Hello, this might be a silly question, but I hope to get your reply.
What is the maximum dimension that GTM can typically handle?
Additionally, after using PCA for dimensionality reduction, is the dimensionality of the GTM inverse mapping also reduced? If I want the inverse mapping to return vectors in the original dimensions, can I still use PCA for the reduction?
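On the second question, a minimal sketch of how the original dimensionality could be recovered, assuming PCA is used for the preprocessing: the GTM would then be trained in the reduced space, so its inverse mapping returns reduced-space vectors, and composing it with PCA's `inverse_transform` gives an approximate reconstruction in the original dimensions (up to the variance discarded by PCA):

```python
import numpy as np
from sklearn.decomposition import PCA

X = np.random.rand(500, 2000)           # hypothetical original data
pca = PCA(n_components=50).fit(X)
X_reduced = pca.transform(X)             # what the GTM would be trained on

# Suppose `y_reduced` is a point produced by the GTM inverse mapping
# (manifold -> data space); it has 50 dimensions, not 2000.
y_reduced = X_reduced[0]

# Map it back to the original 2000-dimensional space (lossy reconstruction).
y_original = pca.inverse_transform(y_reduced.reshape(1, -1))
print(y_original.shape)                  # (1, 2000)
```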