wenc 9 hours ago

This is good, but it would also be worth mentioning that you're using umap for dimensionality reduction with the cosine metric.

https://github.com/Z-Gort/Reservoirs-Lab/blob/main/src/elect...

Dimensionality reduction from n >> 2 dimensions down to 2 can be very fickle, so the hyperparameters matter. Your visualization can change significantly depending on the choice of metric.

https://umap-learn.readthedocs.io/en/latest/parameters.html
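
For concreteness, a rough sketch of the kind of call I mean (umap-learn, with placeholder data standing in for your embeddings; the parameter values are just illustrative defaults, not a recommendation):

  import numpy as np
  import umap

  # Stand-in for your real embeddings: (n_samples, n_dims)
  embeddings = np.random.rand(5000, 768).astype(np.float32)

  # The metric and neighborhood size strongly shape the 2-D layout.
  reducer = umap.UMAP(
      n_components=2,    # target dimensionality
      metric="cosine",   # match the similarity metric used for retrieval
      n_neighbors=15,    # larger values emphasize global structure
      min_dist=0.1,      # smaller values pack clusters more tightly
  )
  projected = reducer.fit_transform(embeddings)  # shape (5000, 2)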

You may want to consider projecting to more than 2 dimensions too. You may ask, how does one visualize more than two dimensions? Through a scatterplot matrix of 2 axes at a time.

https://seaborn.pydata.org/examples/scatterplot_matrix.html

These are used in PCA-type multivariate analyses to visualize latent variables in more than 2 dimensions, but 2 dimensions at a time. Some clustering behavior that cannot be seen in 2 axes might be seen in higher dimensions. We used to do this in our lab to find anomalies in high dimensions.
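
Roughly what I have in mind, as a sketch (assuming umap-learn, pandas, and seaborn; the 4-dimension target and the placeholder data are arbitrary):

  import numpy as np
  import pandas as pd
  import seaborn as sns
  import umap

  embeddings = np.random.rand(5000, 768).astype(np.float32)  # stand-in data

  # Project to 4 dimensions instead of 2, then inspect every pair of axes.
  proj = umap.UMAP(n_components=4, metric="cosine").fit_transform(embeddings)
  df = pd.DataFrame(proj, columns=["d1", "d2", "d3", "d4"])

  # Scatterplot matrix: each panel is one 2-D slice of the 4-D projection.
  sns.pairplot(df, corner=True, plot_kws={"s": 5})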

  • isoprophlex 7 hours ago

    About fickleness... indeed I've found this to be kind of a problem when running large-d text embeddings through umap -- it always comes out spherical, blob-shaped, without any obvious segregation in the low-d projected space.

    IMO it's very difficult to make a "fire and forget" embedding interpreter. Maybe I never found the right parameters for umap, but the results of running it (or any dimension reduction algo) always left me a bit underwhelmed.

    • antman 6 hours ago

      Have you tried PaCMAP? It should be better and faster
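
      (Rough sketch for anyone curious -- the pacmap package, with placeholder data and default-ish parameters, not a tuned setup:)

        import numpy as np
        import pacmap

        embeddings = np.random.rand(5000, 768).astype(np.float32)  # stand-in data

        # PaCMAP tries to preserve both local and global structure.
        reducer = pacmap.PaCMAP(n_components=2, n_neighbors=10)
        projected = reducer.fit_transform(embeddings, init="pca")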

gregncheese 8 hours ago

I have yet to find a better tool than the old Tensorflow projector: https://projector.tensorflow.org/

Granted, it requires you to prepare your data as TSV files first.
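
For anyone wanting to try it, a rough sketch of that export step (file names and labels are placeholders):

  import csv
  import numpy as np

  embeddings = np.random.rand(1000, 768)       # your vectors (stand-in here)
  labels = [f"doc_{i}" for i in range(1000)]   # one metadata row per vector

  # vectors.tsv: one embedding per row, tab-separated
  np.savetxt("vectors.tsv", embeddings, delimiter="\t")

  # metadata.tsv: same row order as vectors.tsv
  with open("metadata.tsv", "w", newline="") as f:
      writer = csv.writer(f, delimiter="\t")
      for label in labels:
          writer.writerow([label])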

  • wenc 7 hours ago

    That is indeed an excellent tool. It allows one to dynamically adjust and recompute umap and t-SNE.

z-gort 12 hours ago

Lmk if anyone has any thoughts... if I could go back, I might not have gone with Electron.

Doing dimensionality reduction locally posed a few challenges in terms of application size -- the idea was that by analyzing just a few thousand randomly sampled points, you can get a feel for your data through a local GUI where you interact with the points and see some correlated metadata.
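
Roughly the kind of flow I mean, outside the GUI (a sketch with made-up table/column names, assuming psycopg2, umap-learn, and a pgvector column):

  import numpy as np
  import psycopg2
  import umap

  conn = psycopg2.connect("postgresql://user:pass@localhost:5432/mydb")  # placeholder DSN
  with conn.cursor() as cur:
      # Randomly sample a few thousand rows so the reduction stays cheap locally.
      cur.execute("SELECT id, embedding FROM documents ORDER BY random() LIMIT 5000")
      rows = cur.fetchall()

  ids = [r[0] for r in rows]
  # Without a registered pgvector adapter, embeddings come back as strings like "[0.1,0.2,...]".
  vectors = np.array([[float(x) for x in r[1].strip("[]").split(",")] for r in rows])

  projected = umap.UMAP(n_components=2, metric="cosine").fit_transform(vectors)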

Not sure if there's much need for a dedicated GUI to go along with Postgres as a VectorDB -- maybe people just do their analysis separately from a normal "GUI"? But maybe not.

What do you think?

  • maxchehab 11 hours ago

    Just some quick feedback: I can't copy & paste in the connection URL input form. On a Mac.

    Once loaded, I get the error "Table must contain a UUID column for vector visualization."

    I'm assuming it's trying to find an ID column for grouping? Can we manually specify this? My ID columns are varchars.

    • garybake 7 hours ago

      Same here. I'm using langchain, which creates a varchar id column. It also puts different collections in the same table.

samanthasu 3 hours ago

That is an excellent visualization!

ddtaylor 8 hours ago

Does this use pgVector?

  • z-gort 2 hours ago

    It lets you visualize any column with type "EMBEDDING", and I think the only way to get that is through pgvector/pgvectorscale.
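
    (If it helps, a minimal setup sketch -- made-up names, assuming psycopg2, the pgvector extension installed, and Postgres 13+ for gen_random_uuid:)

      import psycopg2

      conn = psycopg2.connect("postgresql://user:pass@localhost:5432/mydb")  # placeholder DSN
      with conn, conn.cursor() as cur:
          cur.execute("CREATE EXTENSION IF NOT EXISTS vector;")
          cur.execute("""
              CREATE TABLE IF NOT EXISTS documents (
                  id uuid PRIMARY KEY DEFAULT gen_random_uuid(),  -- UUID id column
                  content text,
                  embedding vector(768)  -- pgvector embedding column
              );
          """)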

dmezzetti an hour ago

Very interesting, thanks for sharing!

thangngoc89 11 hours ago

As a non-native English speaker who is not very familiar with vector databases, I found the title very ambiguous. I read it as Postgres being a GUI for some VectorDB. Upon closer inspection, I realized that "Postgres as a VectorDB" is a single phrase. Maybe shorten it to something else. Just my 2 cents.

  • colechristensen 10 hours ago

    It’s just plain bad grammar; the title should be

    “Show HN: Reservoirs Lab, a Postgres VectorDB GUI”

    • monsieurbanana 10 hours ago

      I think the confusing term is "VectorDB", which sounds like the name of an existing product. "A vector DB GUI powered by Postgres"?