Robot and AI Enthusiasts Forum

Topic: What do the numbers that tensorflow's logistic_regression example prints to the console mean?

Author: guotong1988    Posted: 2015-12-26 20:41
Topic: What do the numbers that tensorflow's logistic_regression example prints to the console mean?
0 0.8853
1 0.8972
2 0.9036
3 0.9076
4 0.9095
5 0.9105
6 0.912
7 0.9133
8 0.9146
9 0.9149
10 0.9158
11 0.9169
12 0.9168
13 0.917
14 0.9173
15 0.9176
16 0.918
17 0.9187
18 0.9192
19 0.919
20 0.9193
21 0.9199
22 0.9201
23 0.9204
24 0.9204
25 0.9206
26 0.9206
27 0.9205
28 0.9204
29 0.9206
30 0.9208
31 0.9213
32 0.921
33 0.9209
34 0.9211
35 0.9215
36 0.9213
37 0.9212
38 0.9215
39 0.9217
40 0.9219
41 0.9218
42 0.9218
43 0.9217
44 0.9216
45 0.9216
46 0.9218
47 0.922
48 0.9219
49 0.9219
50 0.9219
51 0.9218
52 0.922
53 0.922
54 0.9222
55 0.9224
56 0.9228
57 0.9227
58 0.9226
59 0.9228
60 0.9231
61 0.9232
62 0.9235
63 0.9235
64 0.9235
65 0.9236
66 0.9236
67 0.9236
68 0.9236
69 0.9235
70 0.9234
71 0.9234
72 0.9234
73 0.9235
74 0.9234
75 0.9234
76 0.9235
77 0.9236
78 0.9236
79 0.9236
80 0.9237
81 0.9237
82 0.9239
83 0.9239
84 0.9237
85 0.9236
86 0.9237
87 0.9235
88 0.9235
89 0.9233
90 0.9232
91 0.9231
92 0.9231
93 0.9231
94 0.9231
95 0.9232
96 0.9232
97 0.9234
98 0.9233
99 0.9234

Author: morinson    Posted: 2015-12-26 22:22
Could you be more specific? What is the code you are actually running?
Author: guotong1988    Posted: 2015-12-27 10:42
import tensorflow as tf
import numpy as np
import input_data


def init_weights(shape):
    return tf.Variable(tf.random_normal(shape, stddev=0.01))


def model(X, w):
    return tf.matmul(X, w) # note: this is the same model as linear regression; softmax and cross entropy are applied by the baked-in cost function below


mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)
trX, trY, teX, teY = mnist.train.images, mnist.train.labels, mnist.test.images, mnist.test.labels

X = tf.placeholder("float", [None, 784]) # create symbolic variables
Y = tf.placeholder("float", [None, 10])

w = init_weights([784, 10]) # like in linear regression, we need a shared variable weight matrix for logistic regression

py_x = model(X, w)

cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(py_x, Y)) # compute mean cross entropy (softmax is applied internally)
train_op = tf.train.GradientDescentOptimizer(0.05).minimize(cost) # construct optimizer
predict_op = tf.argmax(py_x, 1) # at predict time, evaluate the argmax of the logistic regression

sess = tf.Session()
init = tf.initialize_all_variables()
sess.run(init)

for i in range(100):
    for start, end in zip(range(0, len(trX), 128), range(128, len(trX), 128)):
        sess.run(train_op, feed_dict={X: trX[start:end], Y: trY[start:end]})
    print i, np.mean(np.argmax(teY, axis=1) ==
                     sess.run(predict_op, feed_dict={X: teX, Y: teY}))
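
A side note on the inner loop above: zip(range(0, len(trX), 128), range(128, len(trX), 128)) just pairs up (start, end) slice boundaries so the training set is fed in mini-batches of 128. Below is a tiny standalone sketch of that slicing, using a made-up size N rather than the real mnist.train size:

# Illustration of the mini-batch slicing used in the inner training loop:
# zip(range(0, N, 128), range(128, N, 128)) yields (start, end) index pairs that
# walk through the data in chunks of 128.
N = 300  # made-up training-set size for illustration only
for start, end in zip(range(0, N, 128), range(128, N, 128)):
    print(start, end)  # -> (0, 128) then (128, 256)
# indices 256..299 never form a full 128-sample batch, so that trailing remainder is skipped
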
Author: guotong1988    Posted: 2015-12-27 10:44
Thanks a lot!
Author: morinson    Posted: 2015-12-27 22:37
guotong1988 posted on 2015-12-27 10:42:
import tensorflow as tf
import numpy as np
import input_data

My advanced math is poor, so I don't fully understand it either.

But the gist of the code seems to be:
it loads the MNIST sample data as a training set (trX, trY) and a test set (teX, teY);
then, over 100 iterations (0 to 99), each iteration first trains on the training data and then prints the accuracy achieved on the test data at that point.

You can see that as the amount of training increases, the accuracy increases as well.
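
In other words, each printed line is the iteration index followed by the fraction of test images classified correctly, i.e. the test accuracy. A minimal toy sketch of that accuracy calculation, with made-up arrays standing in for teY and the model's predictions:

import numpy as np

# made-up stand-ins for the one-hot test labels (teY) and the predicted class indices
labels_one_hot = np.array([[0, 1, 0],   # true class 1
                           [1, 0, 0],   # true class 0
                           [0, 0, 1]])  # true class 2
predicted_classes = np.array([1, 0, 0]) # the last prediction is wrong

# np.argmax turns each one-hot row back into a class index; comparing with the
# predictions gives a boolean array, and its mean is the fraction answered correctly
accuracy = np.mean(np.argmax(labels_one_hot, axis=1) == predicted_classes)
print(accuracy)  # 0.666..., the same kind of number printed each iteration above

So "99 0.9234" in the output above means roughly 92.3% of the test digits were classified correctly after 100 passes over the training data.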




Welcome to the Robot and AI Enthusiasts Forum (http://robot-ai.org/) Powered by Discuz! X3.2