Created May 12, 2017 17:04
Using KFold cross-validation for hyper-parameter optimization.
# This gist uses RBF Kernel based Ridge Regression: https://gist.github.com/hakanserce/fdd571132ef44a6a8f7ddd2eb41aba84
import numpy as np
from sklearn.model_selection import KFold  # modern API; the original used the pre-0.18 KFold(n, n_folds=5)

# Pool the original train/test splits so the folds are drawn from all the data.
X_all = np.hstack((X_train, X_test))
y_all = np.hstack((y_train, y_test))

kf = KFold(n_splits=5)

def get_cross_validated_mses(rbf_lambda, d2):
    # Yield the held-out MSE of each fold for one hyper-parameter pair.
    for train, test in kf.split(X_all):
        X_train_cv, X_test_cv = X_all[train], X_all[test]
        y_train_cv, y_test_cv = y_all[train], y_all[test]
        # Train the model on this fold's training split.
        model_cv = BasisFunctionRidgeRegression(RBFKernelTransform(X_train_cv, rbf_lambda), d2)
        model_cv.train(X_train_cv, y_train_cv)
        # Evaluate on the fold's held-out split.
        yield model_cv.mse(X_test_cv, y_test_cv)

def get_cross_validated_mse(rbf_lambda, d2):
    # Average the per-fold MSEs into a single cross-validated score.
    mses = np.fromiter(get_cross_validated_mses(rbf_lambda, d2), dtype=float)
    return np.mean(mses)
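The gist stops at computing a cross-validated MSE; the hyper-parameter optimization itself would be a search over (rbf_lambda, d2) pairs that keeps the pair with the lowest score. Below is a minimal self-contained sketch of that grid search. It uses scikit-learn's KernelRidge as a stand-in for the gist's BasisFunctionRidgeRegression (which lives in the linked gist), with gamma and alpha playing the roles of the two hyper-parameters; the synthetic data and grid values are illustrative assumptions.

```python
import numpy as np
from sklearn.model_selection import KFold
from sklearn.kernel_ridge import KernelRidge  # stand-in model, not the gist's class

# Synthetic 1-D regression data (illustrative only).
rng = np.random.default_rng(0)
X_all = rng.uniform(-3, 3, size=(100, 1))
y_all = np.sin(X_all).ravel() + rng.normal(scale=0.1, size=100)

kf = KFold(n_splits=5, shuffle=True, random_state=0)

def cv_mse(gamma, alpha):
    # Mean held-out MSE across the 5 folds for one hyper-parameter pair,
    # mirroring get_cross_validated_mse above.
    mses = []
    for train, test in kf.split(X_all):
        model = KernelRidge(kernel="rbf", gamma=gamma, alpha=alpha)
        model.fit(X_all[train], y_all[train])
        pred = model.predict(X_all[test])
        mses.append(np.mean((pred - y_all[test]) ** 2))
    return float(np.mean(mses))

# Exhaustive search over a small grid; keep the pair with the lowest CV MSE.
grid = [(g, a) for g in (0.1, 1.0, 10.0) for a in (1e-3, 1e-1, 1.0)]
best = min(grid, key=lambda p: cv_mse(*p))
print("best (gamma, alpha):", best)
```

The same pattern applies to the gist's model directly: replace KernelRidge with BasisFunctionRidgeRegression and cv_mse with get_cross_validated_mse.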