Siamese networks vs Semantic similarity (may be gensim)

I am trying to understand Siamese networks. Here, a vector is computed for an object (say, an image) by a neural network, and a distance metric (say, Manhattan distance) is applied to the two vectors produced by the twin networks. In the tutorials available online, the idea is applied mostly to images.
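As a minimal sketch of that distance step, assuming the two branches have already produced embedding vectors (the values here are made up for illustration):

```python
import numpy as np

# Hypothetical embeddings produced by the two weight-sharing branches
emb_a = np.array([0.2, 0.7, 0.1])
emb_b = np.array([0.4, 0.5, 0.3])

# Manhattan (L1) distance between the two embedding vectors
manhattan = np.sum(np.abs(emb_a - emb_b))
print(manhattan)
```

A small distance means the network considers the two inputs similar; the threshold for "same" vs "different" is learned during training.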

If I compare this with Gensim semantic similarity, there we also have vectors for two objects (words or sentences) and then compute cosine similarity to measure how close they are (remember the example King - man + woman = Queen).
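The cosine-similarity comparison and the analogy arithmetic can be sketched with toy vectors (these are made-up numbers, not real Gensim embeddings, chosen so the analogy works out):

```python
import numpy as np

def cosine_sim(u, v):
    # Cosine similarity: dot product normalized by vector lengths
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy word vectors, illustrative only
king  = np.array([0.8, 0.9, 0.1])
man   = np.array([0.9, 0.1, 0.1])
woman = np.array([0.1, 0.1, 0.9])
queen = np.array([0.0, 0.9, 0.9])

# King - man + woman should land near Queen
analogy = king - man + woman
print(cosine_sim(analogy, queen))  # close to 1.0 for these toy vectors
```

With a real pretrained model you would call something like `model.most_similar(positive=["king", "woman"], negative=["man"])` in Gensim, which performs this same vector arithmetic internally.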

Am I missing some aspect of Siamese networks, or are these actually the same?

Topic semantic-similarity siamese-networks cnn gensim cosine-distance

Category Data Science


Siamese networks and semantic similarity are not the same. Siamese networks are used effectively for image classification, and there is also a paper from Cornell University applying them to text classification. To your question: we train a Siamese network on pairs, one positive and one negative. Take the job title "java developer": we feed "java developer" and "j2ee developer" through the network as a positive (similar) pair, and "java developer" and "QA engineer" through it as a negative (dissimilar) pair. The two branches of the network share the same weights, so both inputs are embedded by the same function.

Because we train the network on both positive and negative pairs, it learns what makes two inputs similar; this pair-based setup is also what enables so-called one-shot learning.
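A common objective for this pair-based training is the contrastive loss; here is a minimal NumPy sketch, assuming the network has already produced a distance `d` for each pair (the label convention and margin value are illustrative assumptions):

```python
import numpy as np

def contrastive_loss(d, y, margin=1.0):
    # y = 1 for a positive (similar) pair, y = 0 for a negative pair.
    # Positive pairs are pulled together (loss grows with d^2);
    # negative pairs are pushed at least `margin` apart (hinge term).
    return y * d**2 + (1 - y) * np.maximum(margin - d, 0.0)**2

# Hypothetical distances from the shared-weight network:
print(contrastive_loss(0.2, 1))  # "java developer" vs "j2ee developer" (positive pair)
print(contrastive_loss(0.3, 0))  # "java developer" vs "QA engineer" (negative pair)
```

Minimizing this loss over many pairs is what shapes the embedding space so that a simple distance at inference time separates matches from non-matches.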

Now coming back to semantic similarity: it is unsupervised learning, in which the model places similar objects close together in the vector space; the cosine distance we get from Gensim tells us how close one object is to another.
