python - TensorBoard no graph -


I am trying to get TensorBoard to visualize the graph of my network. Below is a simple CNN for MNIST classification. The code is from the TensorBoard tutorial.

Code:

import os
import tensorflow as tf
import urllib

gist_url = 'https://gist.githubusercontent.com/dandelionmane/4f02ab8f1451e276fea1f165a20336f1/raw/dfb8ee95b010480d56a73f324aca480b3820c180'
logdir = '/tmp/mnist_tutorial/'

### MNIST EMBEDDINGS ###
mnist = tf.contrib.learn.datasets.mnist.read_data_sets(train_dir=logdir + 'data', one_hot=True)

# Define a simple convolutional layer
def conv_layer(input, channels_in, channels_out):
    w = tf.Variable(tf.zeros([5, 5, channels_in, channels_out]))
    b = tf.Variable(tf.zeros([channels_out]))
    conv = tf.nn.conv2d(input, w, strides=[1, 1, 1, 1], padding="SAME")
    act = tf.nn.relu(conv + b)
    return act

def fc_layer(input, channels_in, channels_out):
    w = tf.Variable(tf.zeros([channels_in, channels_out]))
    b = tf.Variable(tf.zeros([channels_out]))
    act = tf.nn.relu(tf.matmul(input, w) + b)
    return act

def make_hparam_string(learning_rate, use_two_fc, use_two_conv):
    conv_param = "conv=2" if use_two_conv else "conv=1"
    fc_param = "fc=2" if use_two_fc else "fc=1"
    return "lr_%.0e,%s,%s" % (learning_rate, conv_param, fc_param)

# Set up placeholders, and reshape the data
x = tf.placeholder(tf.float32, shape=[None, 784])
y = tf.placeholder(tf.float32, shape=[None, 10])
x_image = tf.reshape(x, [-1, 28, 28, 1])

# Create the network
conv1 = conv_layer(x_image, 1, 32)
pool1 = tf.nn.max_pool(conv1, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding="SAME")
conv2 = conv_layer(pool1, 32, 64)
pool2 = tf.nn.max_pool(conv2, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding="SAME")
flattened = tf.reshape(pool2, [-1, 7 * 7 * 64])
fc1 = fc_layer(flattened, 7 * 7 * 64, 1024)
logits = fc_layer(fc1, 1024, 10)

# Compute cross entropy as our loss function
cross_entropy = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y))

# Use an AdamOptimizer to train the network
train_step = tf.train.AdamOptimizer(1e-4).minimize(cross_entropy)

# Compute the accuracy
correct_prediction = tf.equal(tf.argmax(logits, 1), tf.argmax(y, 1))
accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))

sess = tf.Session()
# Initialize all the variables
sess.run(tf.global_variables_initializer())
hparam = make_hparam_string(.1, True, True)

writer = tf.summary.FileWriter(logdir + hparam)
writer.add_graph(sess.graph)

# Train for 2000 steps
for i in range(20):
    batch = mnist.train.next_batch(100)
    # Report the accuracy
    if i % 5 == 0:
        [train_accuracy] = sess.run([accuracy], feed_dict={x: batch[0], y: batch[1]})
        print("step %d, training accuracy %g" % (i, train_accuracy))
    # Run the training step
    sess.run(train_step, feed_dict={x: batch[0], y: batch[1]})

writer.close()
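As I understand it, the graph-writing part of the tutorial boils down to the last few lines: create a tf.summary.FileWriter, hand it the session graph, and close the writer. Here is a stripped-down sketch of just that pattern (TF 1.x assumed; '/tmp/graph_demo' is only an illustrative logdir, not part of the tutorial):

import tensorflow as tf

# Build a tiny graph, then write it so TensorBoard's Graphs tab can display it.
a = tf.placeholder(tf.float32, name='a')
b = tf.constant(2.0, name='b')
c = tf.multiply(a, b, name='c')

with tf.Session() as sess:
    # '/tmp/graph_demo' is just an illustrative logdir
    writer = tf.summary.FileWriter('/tmp/graph_demo', sess.graph)
    print(sess.run(c, feed_dict={a: 3.0}))
    writer.close()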

The graph is not there! Why? I close the writer as well (as was mentioned in this post about there being no graph in TensorBoard). I am not sure what I am missing.

The TensorBoard logdir:

$ tree mnist_tutorial/
mnist_tutorial/
├── data
│   ├── t10k-images-idx3-ubyte.gz
│   ├── t10k-labels-idx1-ubyte.gz
│   ├── train-images-idx3-ubyte.gz
│   └── train-labels-idx1-ubyte.gz
└── lr_1e-01,conv=2,fc=2
    └── events.out.tfevents.1503327291.neon-2.local

2 directories, 5 files

What should the TensorBoard logdir be? I am assuming lr_1e-01,conv=2,fc=2, since it contains the event file and it is what I passed to the FileWriter.
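To see whether the graph even made it into the events file, I can iterate over it with tf.train.summary_iterator. A quick sanity check (TF 1.x assumed; the path is the one from the tree listing above):

import tensorflow as tf

# Sanity check: does the events file contain a GraphDef at all?
events_path = ('/tmp/mnist_tutorial/lr_1e-01,conv=2,fc=2/'
               'events.out.tfevents.1503327291.neon-2.local')

found = False
for event in tf.train.summary_iterator(events_path):
    if event.HasField('graph_def'):
        found = True
        break
print('GraphDef present:', found)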

Are you using the Windows version of TensorFlow? Try the code below:

tf.train.write_graph(sess.graph_def, logdir+hparam, 'graph.pb', False)
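If that produces a graph.pb, one way to feed it back to TensorBoard afterwards is to re-import it and write it with a FileWriter. This is only a sketch on my side (TF 1.x, reusing the logdir/hparam values from your code), not something the tutorial itself does:

import tensorflow as tf

# Load the binary graph.pb written by tf.train.write_graph and hand it
# to a FileWriter so TensorBoard can show it.
logdir = '/tmp/mnist_tutorial/'        # same values as in the question
hparam = 'lr_1e-01,conv=2,fc=2'
pb_path = logdir + hparam + '/graph.pb'

graph_def = tf.GraphDef()
with tf.gfile.GFile(pb_path, 'rb') as f:
    graph_def.ParseFromString(f.read())

graph = tf.Graph()
with graph.as_default():
    tf.import_graph_def(graph_def, name='')

writer = tf.summary.FileWriter(logdir + hparam, graph)
writer.close()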

