Why not use linear regression for fine-tuning the last layer of a neural network?

In transfer learning, often only the last layer of the network is retrained using gradient descent. However, the last layer of a common neural network performs only a linear transformation, so why do we use gradient descent rather than linear (or logistic) regression to fine-tune the last layer?
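
The setup being asked about, sketched minimally (a Keras backbone plus scikit-learn; X_train, y_train, X_test, y_test are placeholder arrays): with the backbone frozen, the extracted features are fixed, so the head can indeed be fit as an ordinary logistic regression.

    # Sketch: fit the "last layer" as a logistic regression on frozen features.
    from tensorflow.keras.applications import ResNet50
    from tensorflow.keras.layers import GlobalAveragePooling2D
    from tensorflow.keras import Model
    from sklearn.linear_model import LogisticRegression

    backbone = ResNet50(include_top=False, weights='imagenet', input_shape=(224, 224, 3))
    extractor = Model(backbone.input, GlobalAveragePooling2D()(backbone.output))

    features = extractor.predict(X_train)          # X_train, y_train assumed given
    clf = LogisticRegression(max_iter=1000).fit(features, y_train)
    print(clf.score(extractor.predict(X_test), y_test))
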
Category: Data Science

How to summarize very large neural networks?

I am doing a lot of work with transfer learning at the moment (using Keras and TensorFlow, if that is relevant), and I am having trouble summarizing very large models adequately. This post: How do you visualize neural network architectures? shows a lot of useful methods for visualizing architectures, and they are great for networks such as VGG16, but none of them are reasonable to include in a report if the models are very large (such as …
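
One plain-text workaround, sketched here under the assumption of a Keras model (the grouping by layer type is illustrative), is to aggregate parameter counts instead of printing every layer:

    # Sketch: compact per-layer-type summary instead of the full model.summary().
    from collections import Counter
    from tensorflow.keras.applications import ResNet50

    model = ResNet50(weights=None)  # stand-in for any large model
    counts, params = Counter(), Counter()
    for layer in model.layers:
        kind = type(layer).__name__
        counts[kind] += 1
        params[kind] += layer.count_params()

    for kind in counts:
        print(f"{kind:25s} x{counts[kind]:4d}  {params[kind]:>12,d} params")
    print(f"total parameters: {model.count_params():,d}")
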
Category: Data Science

To freeze or not, batch normalisation in ResNet when transfer learning

I'm using a ResNet50 model pretrained on ImageNet to do transfer learning on an image classification task. The easy way of doing this is simply freezing the conv layers (or really all layers except the final fully connected layer). However, I came across a paper where the authors mention that batch normalisation layers should be fine-tuned when fitting the new model: Few layers such as Batch Normalization (BN) layers shouldn't be frozen because the mean and variance of the …
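
A minimal Keras sketch of what that advice amounts to (the selection of layers by type is illustrative, not the paper's exact recipe): freeze everything except the BatchNormalization layers, so their statistics can adapt to the new data.

    # Sketch: freeze all layers except BatchNormalization.
    from tensorflow.keras.applications import ResNet50
    from tensorflow.keras.layers import BatchNormalization

    base = ResNet50(include_top=False, weights='imagenet', input_shape=(224, 224, 3))
    for layer in base.layers:
        # BN layers stay trainable so their mean/variance can adapt;
        # everything else is frozen.
        layer.trainable = isinstance(layer, BatchNormalization)
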
Category: Data Science

Are there voice or audio weights for VGG or Inception?

I want to use VGG16 (or VGG19) for a voice clustering task. I read some articles which suggest using VGG (16 or 19) to build the embedding vector for the clustering algorithm. The process is to convert the wav file into an MFCC or a plot (amplitude vs. time) and use this as input to the VGG model. I tried it out with VGG19 (and weights='imagenet'). I got bad results, and I assume it is because I'm using VGG with the wrong weights …
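
For reference, the pipeline being described looks roughly like the sketch below (librosa for the spectrogram; the file path, sample rate, and crude resize are placeholders):

    # Sketch: wav -> mel spectrogram "image" -> VGG16 embedding for clustering.
    import numpy as np
    import librosa
    from tensorflow.keras.applications import VGG16
    from tensorflow.keras.applications.vgg16 import preprocess_input

    y, sr = librosa.load('clip.wav', sr=16000)      # placeholder path
    mel = librosa.power_to_db(librosa.feature.melspectrogram(y=y, sr=sr), ref=np.max)

    img = np.stack([mel] * 3, axis=-1)              # tile to 3 channels for VGG
    img = np.resize(img, (224, 224, 3))             # crude resize, for illustration only
    img = preprocess_input(img[np.newaxis].astype('float32'))

    vgg = VGG16(include_top=False, weights='imagenet', pooling='avg')
    embedding = vgg.predict(img)                    # feature vector for the clusterer
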
Category: Data Science

model.fit vs model.evaluate gives different results?

The following is a small snippet of the code, but I'm trying to understand the results of model.fit with the train and test datasets vs. the model.evaluate results. I'm not sure whether they simply do not match up, or whether I'm not understanding how to read the results.

    batch_size = 16
    img_height = 127
    img_width = 127
    channel = 3  # RGB
    train_dataset = image_dataset_from_directory(Train_data_dir,
                                                 shuffle=True,
                                                 batch_size=batch_size,
                                                 image_size=(img_height, img_width),
                                                 class_names=class_names)
    ## Transfer learning code from MobileNetV2/ImageNet here to create model
    initial_epochs = …
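
Part of the explanation is mechanical: the metrics printed during model.fit are a running average over the batches of an epoch, computed with dropout and batch norm in training mode, whereas model.evaluate makes a single clean pass in inference mode. A sketch of a like-for-like comparison (validation_dataset and the 'accuracy' metric name are assumptions):

    # Sketch: compare fit() logs with evaluate() after training finishes.
    history = model.fit(train_dataset, validation_data=validation_dataset,
                        epochs=initial_epochs)

    # evaluate() makes one pass in inference mode, so these numbers
    # need not match the running averages that fit() printed.
    train_loss, train_acc = model.evaluate(train_dataset)
    val_loss, val_acc = model.evaluate(validation_dataset)
    print(train_acc, history.history['accuracy'][-1])
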
Category: Data Science

Dealing with little available data: transfer learning

Suppose I seek to predict a certain numerical value, where the data set containing the predetermined correct labels is only very small. However, I'm also provided a large data set with a label that is correlated with the one I want to predict. I read that transfer learning could be used to make use of this larger data set for predicting the desired label from the smaller data set. Could someone elaborate a bit on this?
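
The usual recipe, sketched below under broad assumptions (a small Keras regressor; n_features, big_X/big_y for the large correlated-label set and small_X/small_y for the small target set are placeholders): pretrain on the proxy label, then swap the output layer and fine-tune on the few true labels.

    # Sketch: pretrain on a correlated label, then fine-tune on the scarce one.
    from tensorflow.keras import Sequential, Model
    from tensorflow.keras.layers import Dense

    base = Sequential([Dense(64, activation='relu', input_shape=(n_features,)),
                       Dense(64, activation='relu'),
                       Dense(1, name='proxy_output')])
    base.compile(optimizer='adam', loss='mse')
    base.fit(big_X, big_y, epochs=20)               # large, correlated-label data

    # Reuse the learned representation; new head for the true target.
    model = Model(base.input, Dense(1, name='target_output')(base.layers[-2].output))
    for layer in model.layers[:-1]:
        layer.trainable = False                     # optionally freeze shared layers
    model.compile(optimizer='adam', loss='mse')
    model.fit(small_X, small_y, epochs=50)          # small, true-label data
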
Category: Data Science

Pretrained vs. finetuned model

I have a doubt regarding terminology. When dealing with huggingface transformer models, I often read about "using pretrained models for classification" vs. "fine-tuning a pretrained model for classification." I fail to understand what the exact difference between these two is. As I understand, pretrained models by themselves cannot be used for classification, regression, or any relevant task, without attaching at least one more dense layer and one more output layer, and then training the model. In this case, we would …
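
In huggingface terms, the two phrases map roughly onto the sketch below (stock model names, purely illustrative): "using a pretrained model" treats the encoder as a frozen feature extractor under your own head, while "fine-tuning" attaches a task head and updates the encoder's weights too.

    # Sketch: feature extraction vs. fine-tuning with transformers.
    from transformers import AutoModel, AutoModelForSequenceClassification

    # (a) Pretrained model as a frozen feature extractor.
    encoder = AutoModel.from_pretrained("bert-base-uncased")
    for p in encoder.parameters():
        p.requires_grad = False                     # only your added head trains

    # (b) Fine-tuning: a fresh classification head is attached and the
    #     whole network (encoder + head) is trained on the labelled data.
    clf = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased",
                                                             num_labels=2)
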
Category: Data Science

Simple question about predicting classes: the item in question vs. not the item in question

Let's say I wanted to use transfer learning to train a model to detect object A vs. everything else. In this case, do I provide two types of input, images of object A and images of everything else, and then have the final layer of the model output either object A or not-object A? What about the case where I want object A vs. object B vs. everything else? Would it make sense in this case to provide images …
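
For the three-way case, the common setup is a single softmax head over the classes A, B, and "everything else", as in this illustrative Keras sketch (the backbone choice and shapes are placeholders):

    # Sketch: one head covering "object A", "object B" and "everything else".
    from tensorflow.keras import Sequential
    from tensorflow.keras.layers import Dense, GlobalAveragePooling2D
    from tensorflow.keras.applications import MobileNetV2

    base = MobileNetV2(include_top=False, weights='imagenet', input_shape=(224, 224, 3))
    base.trainable = False
    model = Sequential([base,
                        GlobalAveragePooling2D(),
                        Dense(3, activation='softmax')])  # classes: A, B, other
    model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')
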
Category: Data Science

Transfer learning: Poor performance with last layer replaced

I am using a transfer learning approach. For this I followed the TensorFlow for Poets tutorial. I use a pre-trained InceptionV3 architecture trained on the ImageNet dataset. The last layer and the softmax classification are replaced and retrained using a new set of 7 classes. Data: per class I have around 4,000-5,000 images. I tried multiple training parameters with an AdamOptimizer. The labels are noisy: about 15-20% of them are incorrect. The images show products of a …
Category: Data Science

When can it be called transfer learning?

A common definition of transfer learning is: "Transfer learning is the improvement of learning in a new task through the transfer of knowledge from a related task that has already been learned." — Chapter 11: Transfer Learning, Handbook of Research on Machine Learning Applications, 2009. This raises the question of when a task can be termed "related". Let's assume a neural network is trained to estimate house prices for American houses. Could it be called transfer learning if I retrain/fine-tune the …
Category: Data Science

Distribution Shift vs Transfer Learning

Transfer learning (TL) is a research problem in machine learning (ML) that focuses on storing knowledge gained while solving one problem and applying it to a different but related problem. [1] Distribution shift: the conditions under which the system was developed differ from those in which we use the system. [2] I consider there to be no difference between distribution shift and dataset shift. But what are the differences between transfer learning and distribution shift? Can we say that transfer …
Category: Data Science

Negative examples for a Yes/No image classification neural network

I am trying to retrain a neural network using transfer learning so that it can classify whether an image contains a certain object, say, a car. My positive sample dataset is quite small, only ~2,500 images. It works really well with "regular" binary classification (2,500 images of cars vs. 2,500 images of flowers, where it has to differentiate between those two), but the problem is that I am not sure how to make it classify against all types of images, or how to make …
Category: Data Science

What are the theoretical differences of multitask learning vs fine tuning based transfer learning?

Suppose I have the following scenario: I have a bunch of fruits, i.e., apple, orange, and banana. I simply made a multitask model, where my network first tells me which fruit it is, and then tells me its color. Suppose I give my network an apple; it tells me (a) it is an apple, (b) it is red. By doing some theoretical study, I have understood that this is one type of inductive transfer learning (TL) (correct me, …
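
For concreteness, a multitask model of that shape might look like the functional-API sketch below (illustrative sizes and class lists): one shared trunk, two task-specific heads, trained jointly rather than one task after the other.

    # Sketch: shared trunk with two heads, one for fruit and one for colour.
    from tensorflow.keras import Input, Model
    from tensorflow.keras.layers import Conv2D, GlobalAveragePooling2D, Dense

    inp = Input(shape=(64, 64, 3))
    x = Conv2D(32, 3, activation='relu')(inp)
    x = GlobalAveragePooling2D()(x)
    fruit = Dense(3, activation='softmax', name='fruit')(x)    # apple/orange/banana
    colour = Dense(4, activation='softmax', name='colour')(x)  # e.g. red/orange/yellow/green

    model = Model(inp, [fruit, colour])
    model.compile(optimizer='adam',
                  loss={'fruit': 'sparse_categorical_crossentropy',
                        'colour': 'sparse_categorical_crossentropy'})
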
Category: Data Science

Transfer learning between Language Model and classification

Following this fast.ai lecture, I am trying to understand the mechanism of transfer learning in NLP from a general Language Model (LM) to a classification problem. What exactly is taken from the language model training? Is it just the word embeddings? Or is it also the weights of the LSTM cell? The architecture of the neural net should be quite different: in an LM you would output a prediction after every sequence step, whereas in a classification problem you would …
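
A rough Keras sketch of what can be carried over (illustrative sizes; in the ULMFiT-style recipe both the embeddings and the LSTM weights transfer, and only the per-step vocabulary decoder is replaced by a pooled classification head):

    # Sketch: reuse an LM's embedding + LSTM in a classifier; drop the vocab head.
    from tensorflow.keras import Input, Model
    from tensorflow.keras.layers import Embedding, LSTM, Dense, GlobalMaxPooling1D

    vocab, dim = 10000, 128
    emb = Embedding(vocab, dim)
    lstm = LSTM(256, return_sequences=True)

    # Language model: predicts the next token at every step.
    lm_in = Input(shape=(None,))
    lm = Model(lm_in, Dense(vocab, activation='softmax')(lstm(emb(lm_in))))

    # Classifier: the same emb and lstm objects (weights shared), new pooled head.
    clf_in = Input(shape=(None,))
    h = GlobalMaxPooling1D()(lstm(emb(clf_in)))
    clf = Model(clf_in, Dense(1, activation='sigmoid')(h))
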
Category: Data Science

Model transfer with limited to no label information

I have a problem I hope to get some help with here. Say I have a type of product A whose measurements are X_A and whose outcome property is y_A, where y_A is a continuous variable. Then I can build a predictive model from X_A and y_A. Now I have a product B. It's similar to product A but not exactly the same, like an orange to a grapefruit. For product B, I have plenty of X_B measurements, but very …
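
This is essentially the tabular version of fine-tuning, sketched here under the assumption that X_A and X_B live in the same feature space (n_features and the labelled subset X_B_labelled/y_B_labelled are placeholders): train on A, then freeze the early layers and refit only the head on the few labelled B examples.

    # Sketch: transfer a regression model from product A to product B.
    from tensorflow.keras import Sequential
    from tensorflow.keras.layers import Dense

    model = Sequential([Dense(64, activation='relu', input_shape=(n_features,)),
                        Dense(64, activation='relu'),
                        Dense(1)])
    model.compile(optimizer='adam', loss='mse')
    model.fit(X_A, y_A, epochs=50)                  # learn the structure on product A

    for layer in model.layers[:-1]:
        layer.trainable = False                     # keep the representation from A
    model.compile(optimizer='adam', loss='mse')     # recompile after freezing
    model.fit(X_B_labelled, y_B_labelled, epochs=50)
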
Category: Data Science

Cannot achieve good results while transfer learning CIFAR-10 on ResNet50 - Keras

I'm trying to transfer-learn ResNet50 for image classification on the CIFAR-10 dataset. It's stated in the original paper, and also in the ResNet50 documentation on keras.io, that the ResNet should have a minimum input shape of 32x32, but I cannot achieve any good results. Here I have created and compiled the sequential model:

    model = Sequential()
    model.add(ResNet50(include_top=False, weights='imagenet', input_shape=(32, 32, 3)))
    model.add(Flatten())
    model.add(BatchNormalization())
    model.add(Dense(128, activation='relu'))  # Dense layer
    model.add(Dropout(0.5))  # Dropout
    model.add(Dense(10, activation='softmax'))  # Output layer
    model.layers[0].trainable = False  # Set ResNet as NOT trainable
    model.summary()
    model.compile(loss='categorical_crossentropy', …
Category: Data Science

Training a model using transfer learning: the validation accuracy is not improving

I am new to transfer learning. I am doing face mask detection with 4 classes (no face mask worn, face mask worn incorrectly, face mask worn correctly, double mask worn). My objective is to compare different transfer learning models and compare their accuracy, but I am stuck on the first model (MobileNetV2). Methods I have tried: I have tried changing my dataset a few times and mixing datasets of different distributions, but the validation and testing accuracy do not improve. …
Category: Data Science

How to load a saved model in TensorFlow?

This is my code in Python:

    from __future__ import absolute_import, division, print_function, unicode_literals
    import tensorflow as tf
    from matplotlib import pyplot as plt
    import numpy as np

I checked if the saved model is there using the following code:

    tf.compat.v1.saved_model.contains_saved_model('/Link_to_the_saved_model_directory/')

which returns True, and I can use the following code to further make sure the model is saved correctly, as far as I understood:

    tf.saved_model.Asset('/Link_to_the_saved_model_directory/')

which returns this:

    <tensorflow.python.training.tracking.tracking.Asset at 0x2aad125e5710>

So, everything looks fine. But, …
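
Given that contains_saved_model returns True, loading is usually one of the two standard calls, depending on how the directory was written (a sketch; the path placeholder is kept from above):

    # Sketch: the two standard ways to load a SavedModel directory.
    import tensorflow as tf

    path = '/Link_to_the_saved_model_directory/'

    # If it was saved with model.save(path) from Keras:
    model = tf.keras.models.load_model(path)

    # If it was saved with tf.saved_model.save(obj, path):
    loaded = tf.saved_model.load(path)
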
Category: Data Science

Transfer learning with many small datasets

Context: I am working on an NLP model that can classify documents into one of N categories. I have document data from a number of different customers. The document topics are similar across customers, but they classify them into different categories. For simplicity, assume that the documents can contain six different topics: A, B, ..., F. Each customer classifies the documents differently from the topics, i.e. the N mentioned above is customer specific and the mix of topics is (somewhat) different: Customer 1 has three …
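
One standard pattern for this setup, sketched below with illustrative sizes: share a single document encoder across customers, and give each customer its own small classification head over its own categories.

    # Sketch: one shared document encoder, one small head per customer.
    from tensorflow.keras import Input, Model
    from tensorflow.keras.layers import Embedding, GlobalAveragePooling1D, Dense

    vocab, dim = 20000, 128
    inp = Input(shape=(None,))
    shared = GlobalAveragePooling1D()(Embedding(vocab, dim)(inp))  # shared encoder

    # N is customer specific: e.g. 3 categories for customer 1, 4 for customer 2.
    heads = {'customer_1': Dense(3, activation='softmax', name='c1')(shared),
             'customer_2': Dense(4, activation='softmax', name='c2')(shared)}
    models = {name: Model(inp, out) for name, out in heads.items()}
    # Training alternates between customers; the shared encoder accumulates
    # signal from all of the small datasets.
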
Category: Data Science
