# Hands-On: Classifying Multivariate Gaussian Distributions with Logistic Regression in PyTorch (with Source Code)

Logistic regression is commonly used for binary classification. For ease of exposition, we draw two sets of data, X1 and X2, from two multivariate Gaussian distributions. The two Gaussians represent the two classes, with labels y1 and y2 respectively.
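Concretely, the model maps an input $x$ to the probability of the positive class through a sigmoid over a linear function, and training minimizes the binary cross-entropy loss (both are also implemented by hand in the code below as `my_sigmoid` and `my_loss`):

$$p(y=1 \mid x) = \sigma(w^\top x + b) = \frac{1}{1 + e^{-(w^\top x + b)}}$$

$$\mathcal{L} = -\frac{1}{N} \sum_{i=1}^{N} \left[ y_i \log \hat{y}_i + (1 - y_i) \log(1 - \hat{y}_i) \right]$$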

# Optimization Algorithm

Logistic regression is usually trained with gradient descent, and PyTorch's torch.optim package implements most commonly used optimization algorithms. Usage is straightforward: when constructing an optimizer, first pass in the parameters to be learned, then the optimizer's own hyperparameters, such as the learning rate, as sketched below.
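As a minimal sketch of that construction pattern (the `model` here is just a placeholder module, not the full example that follows):

```python
import torch.nn as nn
from torch import optim

# Placeholder module standing in for whatever model is being trained
model = nn.Linear(2, 1)

# First pass the learnable parameters, then hyperparameters such as the learning rate
optimizer = optim.SGD(model.parameters(), lr=0.03)
```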

The source code is shown below.

```python
import numpy as np
import torch
from torch import nn, optim
from torch.distributions import MultivariateNormal
from matplotlib import pyplot as plt

# Two multivariate Gaussians: class 0 centered at (-3, -3), class 1 at (3, 3)
mu1 = -3 * torch.ones(2)
mu2 = 3 * torch.ones(2)
sigma1 = torch.eye(2) * 0.5
sigma2 = torch.eye(2) * 2

m1 = MultivariateNormal(mu1, sigma1)
m2 = MultivariateNormal(mu2, sigma2)
x1 = m1.sample((100,))
x2 = m2.sample((100,))

# Labels: the first 100 samples belong to class 0, the rest to class 1
y = torch.zeros((200, 1))
y[100:] = 1

# Concatenate the two clusters and shuffle samples and labels together
x = torch.cat([x1, x2], dim=0)
idx = np.random.permutation(len(x))
x = x[idx]
y = y[idx]

# Visualize the two clusters
plt.scatter(x1.numpy()[:, 0], x1.numpy()[:, 1])
plt.scatter(x2.numpy()[:, 0], x2.numpy()[:, 1])
plt.show()

# A single linear layer maps the 2-d inputs to 1-d scores
D_in, D_out = 2, 1
linear = nn.Linear(D_in, D_out, bias=True)
output = linear(x)
print(x.shape, linear.weight.shape, linear.bias.shape, output.shape)

# Hand-rolled linear transform; its output should match nn.Linear's
def my_linear(x, w, b):
    return torch.mm(x, w.t()) + b

print(torch.sum(output - my_linear(x, linear.weight, linear.bias)))

# Sigmoid squashes the scores into probabilities in (0, 1)
sigmoid = nn.Sigmoid()
scores = sigmoid(output)

def my_sigmoid(x):
    return 1 / (1 + torch.exp(-x))

# Binary cross-entropy loss, plus a hand-rolled equivalent
loss = nn.BCELoss()
print(loss(sigmoid(output), y))

def my_loss(x, y):
    return -torch.mean(torch.log(x) * y + torch.log(1 - x) * (1 - y))

# Wrap the linear layer and sigmoid into a single nn.Module
class LogisticRegression(nn.Module):
    def __init__(self, D_in):
        super(LogisticRegression, self).__init__()
        self.linear = nn.Linear(D_in, 1)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        return self.sigmoid(self.linear(x))

model = LogisticRegression(2)
criterion = nn.BCELoss()
optimizer = optim.SGD(model.parameters(), lr=0.03)

# Mini-batch gradient descent over the shuffled data
batch_size = 10
iters = 10
for _ in range(iters):
    for i in range(int(len(x) / batch_size)):
        input = x[i * batch_size:(i + 1) * batch_size]
        target = y[i * batch_size:(i + 1) * batch_size]
        optimizer.zero_grad()
        pred = model(input)
        l = criterion(pred, target)
        l.backward()
        optimizer.step()
```
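After training, a quick sanity check is to threshold the predicted probabilities at 0.5 and measure accuracy on the generated data. This is a minimal sketch reusing the `model`, `x`, and `y` defined above; with two clusters this well separated, the accuracy should be close to 1.0:

```python
# Evaluate: threshold probabilities at 0.5 to obtain hard class predictions
with torch.no_grad():
    pred = model(x)
    acc = ((pred >= 0.5).float() == y).float().mean()
print(f"training accuracy: {acc.item():.3f}")
```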