[Exercise 4] Regularization


In this exercise we implement regularized linear regression and regularized logistic regression. Data: ex5Data.zip. A common problem when fitting data is overfitting, so regularization is needed to control model complexity and perform model selection.

Background

Regularized linear regression

We fit a fifth-order polynomial hypothesis:

h_\theta(x) = \theta_0 + \theta_1 x + \theta_2 x^2 + \theta_3 x^3 + \theta_4 x^4 + \theta_5 x^5

The regularized cost function to minimize:

J(\theta) = \frac{1}{2m}\left[\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)^2 + \lambda\sum_{j=1}^{n}\theta_j^2\right]

Note that the penalty sum starts at j = 1, so the intercept \theta_0 is not regularized.

Recall the normal equations from the earlier exercises. With the regularization term added, the closed-form solution becomes

\theta = \left(X^T X + \lambda\,\mathrm{diag}(0,1,\dots,1)\right)^{-1} X^T y

where the leading 0 on the diagonal keeps \theta_0 out of the penalty.
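The regularized normal equation can be checked numerically. Here is a minimal sketch in Python/NumPy; the fifth-order design matrix and the \lambda values mirror the MATLAB code below, but the data is synthetic, since ex5Linx.dat/ex5Liny.dat are not reproduced here:

```python
import numpy as np

# Synthetic 1-D data standing in for ex5Linx.dat / ex5Liny.dat
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-1, 1, 20))
y = x**3 - 0.5 * x + 0.1 * rng.standard_normal(20)

# Fifth-order polynomial design matrix [1, x, x^2, ..., x^5]
X = np.vander(x, 6, increasing=True)
n = X.shape[1]

# diag(0, 1, ..., 1): the intercept theta_0 is not regularized
D = np.diag(np.r_[0.0, np.ones(n - 1)])

# Solve the regularized normal equations for each lambda
thetas = {}
for lam in (0.0, 1.0, 10.0):
    thetas[lam] = np.linalg.solve(X.T @ X + lam * D, X.T @ y)
    print(lam, np.round(thetas[lam], 3))
```

As \lambda grows, the non-intercept coefficients shrink toward zero, which is exactly the smoothing effect visible in the plots.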

Regularized logistic regression

The hypothesis function:

h_\theta(x) = g(\theta^T x) = \frac{1}{1 + e^{-\theta^T x}}

Note that x here is a 28-dimensional feature vector: map_feature expands the two raw inputs u and v into all monomials u^{i-j} v^j up to degree 6, which gives 28 terms including the constant.

The cost function with the regularization term added:

J(\theta) = \frac{1}{m}\sum_{i=1}^{m}\left[-y^{(i)}\log h_\theta(x^{(i)}) - \left(1 - y^{(i)}\right)\log\left(1 - h_\theta(x^{(i)})\right)\right] + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2
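As a quick sanity check, this cost can be evaluated directly. A minimal Python sketch on hypothetical toy data (not the ex5 dataset), showing that the penalty skips \theta_0:

```python
import numpy as np

def reg_logistic_cost(theta, X, y, lam):
    """Regularized logistic cost J(theta); theta[0] is not penalized."""
    m = len(y)
    h = 1.0 / (1.0 + np.exp(-X @ theta))
    data_term = np.mean(-y * np.log(h) - (1 - y) * np.log(1 - h))
    penalty = (lam / (2 * m)) * np.sum(theta[1:] ** 2)
    return data_term + penalty

# Toy example: intercept column plus 2 features
X = np.array([[1.0, 0.5, -1.0],
              [1.0, -0.3, 0.8],
              [1.0, 1.2, 0.1]])
y = np.array([1.0, 0.0, 1.0])

# With theta = 0, h = 0.5 everywhere, so J = log(2) for any lambda
print(reg_logistic_cost(np.zeros(3), X, y, lam=10.0))  # ~0.6931
```

Because only theta[1:] is penalized, changing \lambda has no effect on the cost when the non-intercept weights are zero.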

Now look at how Newton's method changes. The update rule is:

\theta := \theta - H^{-1}\nabla_\theta J

The gradient \nabla_\theta J and the Hessian H:

\nabla_\theta J = \frac{1}{m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)x^{(i)} + \frac{\lambda}{m}\begin{bmatrix}0 \\ \theta_1 \\ \vdots \\ \theta_n\end{bmatrix}

H = \frac{1}{m}\sum_{i=1}^{m}h_\theta(x^{(i)})\left(1 - h_\theta(x^{(i)})\right)x^{(i)}\left(x^{(i)}\right)^T + \frac{\lambda}{m}\,\mathrm{diag}(0,1,\dots,1)

The parameter detail here is important: the first entry of the extra gradient term and the (1,1) entry of the extra diagonal matrix in the Hessian are both zero, because the intercept \theta_0 is never regularized.
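Putting the update rule, gradient, and Hessian together, the Newton iteration can be sketched in Python/NumPy. This is an illustrative implementation on synthetic data, not the ex5 dataset:

```python
import numpy as np

def newton_reg_logistic(X, y, lam, n_iter=15):
    """Newton's method for regularized logistic regression.
    theta[0] (the intercept) is excluded from the penalty."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(n_iter):
        h = 1.0 / (1.0 + np.exp(-X @ theta))
        extra = (lam / m) * theta
        extra[0] = 0.0                      # do not regularize theta_0
        grad = X.T @ (h - y) / m + extra
        L = (lam / m) * np.eye(n)
        L[0, 0] = 0.0
        # X' * diag(h .* (1-h)) * X, written with broadcasting
        H = (X.T * (h * (1 - h))) @ X / m + L
        theta -= np.linalg.solve(H, grad)   # theta := theta - H \ grad
    return theta

# Synthetic, linearly separable toy problem
rng = np.random.default_rng(1)
X = np.c_[np.ones(50), rng.standard_normal((50, 2))]
y = (X[:, 1] + X[:, 2] > 0).astype(float)
theta = newton_reg_logistic(X, y, lam=1.0)
print(np.round(theta, 3))
```

With \lambda > 0 the optimum is finite even on separable data, and 15 Newton steps drive the gradient essentially to zero.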

Results and code

Regularized linear regression results (the \lambda symbol does not render quite right in the plot legend).

Regularized logistic regression:


% Regularized linear regression
clc,clear

x = load('ex5Linx.dat');
y = load('ex5Liny.dat');
% plot the data
plot(x,y,'o','MarkerFaceColor','r')

% fifth-order polynomial design matrix
x = [ones(size(x,1),1) x x.^2 x.^3 x.^4 x.^5];
[m,n] = size(x);

% diag(0,1,...,1): do not regularize the intercept term
diag_m = diag([0;ones(n-1,1)]);

lambda = [0 1 10]';

colortype = {'g','b','r'};

theta = zeros(n,3);
xrange = linspace(min(x(:,2)),max(x(:,2)))';
hold on
% normal equations with the regularization term
for lambda_i = 1:length(lambda)
    theta(:,lambda_i) = (x'*x + lambda(lambda_i).*diag_m)\(x'*y);
    yrange = [ones(size(xrange)) xrange xrange.^2 xrange.^3 xrange.^4 xrange.^5]*theta(:,lambda_i);
    plot(xrange,yrange,char(colortype(lambda_i)))
    hold on
end
legend('training data','\lambda=0','\lambda=1','\lambda=10')
hold off

Logi.m

% Regularized logistic regression
clc,clear

x = load('ex5Logx.dat');
y = load('ex5Logy.dat');

% define the sigmoid function
g = @(z) 1.0 ./ (1.0 + exp(-z));

% map the two raw features to the 28-dimensional feature vector
x = map_feature(x(:,1),x(:,2));
[m,n] = size(x);
lambda = [0 1 10];
MAX_ITR = 15;

% Newton's method
for lambda_i = 1:length(lambda)
    x_plot = load('ex5Logx.dat');
    y_plot = load('ex5Logy.dat');
    figure
    % Find the indices for the 2 classes
    pos = find(y_plot); neg = find(y_plot == 0);

    plot(x_plot(pos,1),x_plot(pos,2),'+')
    hold on
    plot(x_plot(neg,1),x_plot(neg,2),'o','MarkerFaceColor','y')

    % re-initialize the fitting parameters for each lambda
    theta = zeros(n,1);
    J = zeros(MAX_ITR,1);
    for i = 1:MAX_ITR
        z = x*theta;
        h = g(z);
        J(i) = (1/m).*sum(-y.*log(h) - (1 - y).*log(1 - h)) + ...
               (lambda(lambda_i)/(2*m))*norm(theta(2:end))^2;
        % Calculate the gradient and the Hessian;
        % theta(1) (the intercept) is not regularized
        G = (lambda(lambda_i)/m).*theta; G(1) = 0;
        L = (lambda(lambda_i)/m)*eye(n); L(1) = 0;
        grad = ((1/m).*x'*(h - y)) + G;
        H = ((1/m).*x'*diag(h)*diag(1 - h)*x) + L;

        theta = theta - H\grad;
    end
    norm_theta = norm(theta);
    % Define the range of the grid
    u = linspace(-1,1.5,200);
    v = linspace(-1,1.5,200);
    % Initialize space for the values to be plotted
    z = zeros(length(u),length(v));
    % Evaluate z = theta'*x over the grid
    for k = 1:length(u)
        for j = 1:length(v)
            z(k,j) = map_feature(u(k),v(j))*theta;
        end
    end
    z = z';
    % Plot the decision boundary z = 0
    contour(u,v,z,[0,0],'LineWidth',2)
    legend('y=1','y=0','Decision boundary');
    title(sprintf('\\lambda = %g',lambda(lambda_i)),'FontSize',14)
    hold off

end

map_feature.m

function out = map_feature(feat1,feat2)
% MAP_FEATURE Feature mapping function for Exercise 5
%
% map_feature(feat1,feat2) maps the two input features
% to higher-order features as defined in Exercise 5.
%
% Returns a new feature array with more features
%
% Inputs feat1, feat2 must be the same size
%
% Note: this function is only valid for Ex 5, since the degree is
% hard-coded in.
    degree = 6;
    out = ones(size(feat1(:,1)));
    for i = 1:degree
        for j = 0:i
            out(:,end+1) = (feat1.^(i-j)).*(feat2.^j);
        end
    end
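An equivalent of map_feature in Python for illustration, confirming the 28-term expansion (degree 6 gives (6+1)(6+2)/2 = 28 monomials including the constant):

```python
import numpy as np

def map_feature(feat1, feat2, degree=6):
    """All monomials feat1^(i-j) * feat2^j for 0 <= j <= i <= degree,
    including the constant term, in the same column order as the
    MATLAB version above."""
    feat1 = np.atleast_1d(feat1).astype(float)
    feat2 = np.atleast_1d(feat2).astype(float)
    cols = [np.ones_like(feat1)]
    for i in range(1, degree + 1):
        for j in range(i + 1):
            cols.append(feat1 ** (i - j) * feat2 ** j)
    return np.column_stack(cols)

print(map_feature(0.5, -0.2).shape)  # (1, 28)
```

This is why n = 28 in the logistic regression code: the two raw features become a 28-dimensional vector, and theta has one entry per column.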

Reference

  1. http://openclassroom.stanford.edu/MainFolder/DocumentPage.php?course=DeepLearning&doc=exercises/ex5/ex5.html
  2. http://www.cnblogs.com/tornadomeet/archive/2013/03/17/2964515.html
