There are many possibilities, but I suspect the tuning algorithms are overfitting your model. The multisearch/gridsearch etc. algorithms select the combination of hyperparameters that optimizes a metric of your choosing, such as AUC/F1/MCC. If you are optimizing on the training data, the tuning algorithm will select the model with the highest training score, and this model will probably be overfitted.
Without the tuning algorithm, you can, by chance, select hyperparameters that perform worse on the training data but better on the testing data. This is particularly likely if you do not have many instances on which to train and test: with few training instances, overfitting becomes more severe, and with a very small testing set, luck becomes an even greater factor in the reported score.
Your features can also contribute to overfitting. For example, suppose you have a million features but only 100 training instances. If the feature values vary significantly between instances, your model can fit the training data very selectively (essentially memorizing it), which results in poor generalization. If you have many more features than training instances, you should apply dimensionality reduction algorithms such as PCA to create a better search space.
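To illustrate, here is a minimal sketch of that reduction step using scikit-learn's `PCA` (the shapes are made-up example numbers, not taken from your setup):

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic illustration: far more features than instances
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1000))  # 100 instances, 1000 features

# Project onto a number of components well below the instance count,
# so the tuning algorithm searches a much smaller, denser space
pca = PCA(n_components=20)
X_reduced = pca.fit_transform(X)

print(X_reduced.shape)  # -> (100, 20)
```

You would then run the hyperparameter search on `X_reduced` instead of the raw feature matrix.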
To get the best results with tuning algorithms, you should partition your data into training, validation, and testing sets. The multisearch/gridsearch should then evaluate the models on the validation set so that you encourage generalization rather than overfitting to training data.
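In scikit-learn, for example, `GridSearchCV` implements this idea by scoring each hyperparameter candidate on held-out folds of the training data (cross-validation) rather than on the data the model was fit to, while a separate test split stays untouched until the very end. A minimal sketch, with an assumed synthetic dataset:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

# Assumed synthetic data standing in for your dataset
X, y = make_classification(n_samples=300, n_features=20, random_state=0)

# Hold out a test set that the tuning procedure never sees
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# Each candidate C is scored on validation folds, not its own training data
search = GridSearchCV(SVC(), {"C": [0.1, 1, 10]}, cv=5)
search.fit(X_train, y_train)

print(search.best_params_)          # hyperparameters chosen on validation folds
print(search.score(X_test, y_test)) # final estimate on the untouched test set
```

The key point is that the score used to pick the winner and the score you report come from different data.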
Also, your interval selection for the grid search is important. If you are using an SVM with an RBF kernel and grid searching over the gamma and C parameters, the grid values should probably be spaced exponentially (i.e., on a log scale). An example 11x11 grid might be: $ C \in \{10^{-3}, 10^{-2},...,10^7\} $ and $ \gamma \in \{10^{-9},10^{-8},...,10\} $. If you want finer granularity, after creating a heatmap of scores for your grid, you can "zoom into" an area of high scores by performing another grid search with finer intervals over a smaller domain.
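The exponentially spaced grid above is easy to build with `numpy.logspace`; the "zoomed" ranges below are hypothetical, assuming the heatmap peaked somewhere around $C = 10$, $\gamma = 10^{-3}$:

```python
import numpy as np

# Coarse 11x11 grid from the example: 10^-3..10^7 and 10^-9..10^1
C_grid = np.logspace(-3, 7, 11)
gamma_grid = np.logspace(-9, 1, 11)

# "Zoom in": assumed best cell near C=10, gamma=1e-3, so refine
# with finer log-spaced steps over one decade on either side
C_fine = np.logspace(0, 2, 9)       # 10^0 .. 10^2
gamma_fine = np.logspace(-4, -2, 9) # 10^-4 .. 10^-2

print(len(C_grid), len(gamma_grid))  # -> 11 11
```

These arrays can be passed directly as the parameter grid of whichever search tool you use.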