Introduction to the tf.summary module
In TensorFlow, there are three commonly used visualization methods: mixed programming of TensorFlow and OpenCV, visualization with Matplotlib, and TensorFlow's own visualization tool, TensorBoard. All three methods have been covered in some detail in previous blogs. However, the most important visualization method in TensorFlow is accomplished through the cooperation of three components: TensorBoard, tf.summary, and tf.summary.FileWriter.
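Before looking at the individual functions, the following minimal sketch (my own, not from the module source) shows how the three components cooperate; the tag 'loss' and the directory 'logs/' are placeholder names chosen for illustration:

import tensorflow as tf

# A scalar to monitor; in a real model this would be, e.g., the training loss
loss = tf.placeholder(tf.float32, shape=[], name='loss')
tf.summary.scalar('loss', loss)            # 1. define a summary op (tf.summary)
merged = tf.summary.merge_all()            # 2. merge all summary ops into one

with tf.Session() as sess:
    # 3. create the log writer (tf.summary.FileWriter)
    writer = tf.summary.FileWriter('logs/', sess.graph)
    for step in range(100):
        summary = sess.run(merged, feed_dict={loss: 1.0 / (step + 1)})
        writer.add_summary(summary, step)  # 4. write this step's summary to disk
    writer.close()

# 5. visualize with TensorBoard:  tensorboard --logdir=logs/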
The tf.summary module is defined in the summary.py file, which mainly defines the functions used for visualization. The main functions contained in tf.summary are as follows:
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from google.protobuf import json_format as _json_format
from tensorflow.core.framework.summary_pb2 import Summary
from tensorflow.core.framework.summary_pb2 import SummaryDescription
from tensorflow.core.util.event_pb2 import Event
from tensorflow.core.util.event_pb2 import SessionLog
from tensorflow.core.util.event_pb2 import TaggedRunMetadata
from tensorflow.python.eager import context as _context
from tensorflow.python.framework import dtypes as _dtypes
from tensorflow.python.framework import ops as _ops
from tensorflow.python.ops import gen_logging_ops as _gen_logging_ops
from tensorflow.python.ops import summary_op_util as _summary_op_util
from tensorflow.python.ops.summary_ops import tensor_summary
from tensorflow.python.summary.text_summary import text_summary as text
from tensorflow.python.summary.writer.writer import FileWriter
from tensorflow.python.summary.writer.writer_cache import FileWriterCache
from tensorflow.python.util import compat as _compat
from tensorflow.python.util.all_util import remove_undocumented
from tensorflow.python.util.tf_export import tf_export
# ============================================================================
# Module description:
#     Main functions included in tf.summary
# ============================================================================
def scalar(name, tensor, collections=None, family=None)
def image(name, tensor, max_outputs=3, collections=None, family=None)
def histogram(name, values, collections=None, family=None)
def audio(name, tensor, sample_rate, max_outputs=3, collections=None,family=None)
def merge(inputs, collections=None, name=None)
def merge_all(key=_ops.GraphKeys.SUMMARIES, scope=None)
def get_summary_description(node_def)
2. Description of common functions in the tf.summary module:
1 Description of the tf.summary.scalar function
# ============================================================================
# Function prototype:
#       def scalar(name, tensor, collections=None, family=None)
# Function description:
#       [1] Outputs a Summary protocol buffer containing a single scalar value, which is a structured data format that can be parsed by the TensorBoard module
#       [2] Used to display scalar information
#       [3] Used to visualize scalar information
#       [4] In fact, every summary operation in TensorFlow is an op that takes a tensor from the computation graph and generates a Summary
#           protocol buffer, which is a structured data format that TensorBoard can parse and visualize
# Although the above four explanations are formal, I find them difficult to understand, so I interpret the function of tf.summary.scalar() as follows:
#       [1] Write the scalar data in the computation graph to the TensorFlow log file, in preparation for later visualization in TensorBoard
# Parameter description:
#       [1] name: the name of the node, which becomes the tag shown in TensorBoard
#       [2] tensor: the scalar value to be visualized
# Main uses:
#       This function is typically used when plotting loss and accuracy curves.
# ============================================================================
Summary: This function is typically used when plotting loss and accuracy curves. Sample code that records monitoring information for a variable is shown below:
# ============================================================================
# Function description:
#       Generate monitoring information for [variable] and write the generated monitoring information to the [log file]
# Parameter description:
#       [1] var: the tensor whose running state needs to be monitored and recorded
#       [2] name: the name of the chart displayed in the visualization result
# ============================================================================
def variable_summaries(var, name):
    with tf.name_scope('summaries'):
        # [1] Record the distribution of var via tf.summary.histogram()
        tf.summary.histogram(name, var)
        # [2] Record the mean of var as a scalar
        mean = tf.reduce_mean(var)
        tf.summary.scalar('mean/' + name, mean)
        # [3] Record the standard deviation of var as a scalar
        stddev = tf.sqrt(tf.reduce_mean(tf.square(var - mean)))
        tf.summary.scalar('stddev/' + name, stddev)
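For reference, a minimal usage sketch of variable_summaries (the variable and the name 'layer1/weights' are invented for illustration):

# Record histogram, mean, and stddev summaries for a weight matrix
weights1 = tf.Variable(tf.truncated_normal([784, 500], stddev=0.1))
variable_summaries(weights1, 'layer1/weights')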
2 Description of the tf.summary.image function
# ============================================================================
# Function prototype:
#       def image(name, tensor, max_outputs=3, collections=None, family=None)
# Function description:
#       [1] Outputs a summary containing images, built from a 4-dimensional tensor with the four dimensions shown below:
#           [batch_size, height, width, channels]
#       [2] The parameter channels can take three values:
#           [1] 1: tensor is interpreted as a grayscale image
#           [2] 3: tensor is interpreted as an RGB image
#           [3] 4: tensor is interpreted as an RGBA 4-channel image
#       [3] All images input to this function must have the same size (height, width, channels, data type), and for uint8 data all pixel
#           values must lie in the range [0, 255]
# Although the above three explanations are formal, I find them difficult to understand, so I interpret the function of tf.summary.image() as follows:
#       [1] Write the image data in the computation graph to the TensorFlow log file, in preparation for later visualization in TensorBoard
#
# Parameter description:
#       [1] name: the name of the node, which becomes the tag shown in TensorBoard
#       [2] tensor: the image data to be visualized, a four-dimensional tensor with elements of type uint8 or float32 and dimensions
#           [batch_size, height, width, channels]
#       [3] max_outputs: the maximum number of images from the batch to display, which can be understood in combination with the example code below
# Main uses:
#       Generally used to visualize images in neural networks
# ============================================================================
Summary: Sample code for visualizing images used in a neural network is shown below:
from tensorflow.examples.tutorials.mnist import input_data
import tensorflow as tf

def main(argv=None):
    # [1] Load the MNIST data from disk
    mnist = input_data.read_data_sets('F:/MnistSet/', one_hot=True)
    # [2] Define two [placeholders] as the input variables for the [training sample images, each stored as a feature vector]
    #     and the [class labels], and place these placeholders in the namespace 'input'
    with tf.name_scope('input'):
        x = tf.placeholder('float', [None, 784], name='x-input')
        y_ = tf.placeholder('float', [None, 10], name='y-input')
    # [3] Restore the [input feature vector] to an [image pixel matrix], and define an operation that writes the current image
    #     information to the log via the tf.summary.image function
    with tf.name_scope('input_reshape'):
        image_shaped_input = tf.reshape(x, [-1, 28, 28, 1])
        tf.summary.image('input', image_shaped_input, 10)
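To make the role of max_outputs concrete, here is a small self-contained sketch (the random image data and log directory are invented for illustration): with a batch of 16 images and max_outputs=10, only the first 10 images of the batch appear in TensorBoard.

import numpy as np
import tensorflow as tf

# A batch of 16 random grayscale 28x28 images (uint8, values in [0, 255])
batch = np.random.randint(0, 256, size=(16, 28, 28, 1), dtype=np.uint8)
images = tf.placeholder(tf.uint8, [None, 28, 28, 1], name='images')
# max_outputs=10: at most the first 10 images of the batch are written to the log
img_summary = tf.summary.image('random_images', images, max_outputs=10)

with tf.Session() as sess:
    writer = tf.summary.FileWriter('logs/image_demo')
    summary = sess.run(img_summary, feed_dict={images: batch})
    writer.add_summary(summary, 0)
    writer.close()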
3 Description of the tf.summary.histogram function
# ============================================================================
# Function prototype:
#       def histogram(name, values, collections=None, family=None)
# Function description:
#       [1] Used to display histogram information
#       [2] Adds a histogram summary, which can be used to visualize the distribution of your data. More specific information about
#           histograms in TensorBoard can be found at the following link:
#           https://www.tensorflow.org/programmers_guide/tensorboard_histograms
#
# Although the above two explanations are formal, I find them difficult to understand, so I interpret the function of tf.summary.histogram() as follows:
#       [1] Write the data distribution/data histogram in the computation graph to the TensorFlow log file, in preparation for later visualization in TensorBoard
# Parameter description:
#       [1] name: the name of the node, which becomes the tag shown in TensorBoard
#       [2] values: the data to be visualized, which can be of any shape or size
# Main uses:
#       Generally used to display the distribution of variables during training
# ============================================================================
Summary: Sample code that displays the distribution of variables during training is shown below:
# Function description:
#       Generate a fully connected neural network layer
# ============================================================================
def nn_layer(input_tensor, input_dim, output_dim, layer_name, act=tf.nn.relu):
    with tf.name_scope(layer_name):
        # Declare the weights and record their monitoring information
        with tf.name_scope('weights'):
            weights = tf.Variable(tf.truncated_normal([input_dim, output_dim], stddev=0.1))
            variable_summaries(weights, layer_name + '/weights')
        # Declare the biases and record their monitoring information
        with tf.name_scope('biases'):
            biases = tf.Variable(tf.constant(0.0, shape=[output_dim]))
            variable_summaries(biases, layer_name + '/biases')
        with tf.name_scope('Wx_plus_b'):
            preactivate = tf.matmul(input_tensor, weights) + biases
            # Record the distribution of the pre-activation values
            tf.summary.histogram(layer_name + '/pre_activations', preactivate)
        activations = act(preactivate, name='activation')
        # Record the distribution of the post-activation values
        tf.summary.histogram(layer_name + '/activations', activations)
        return activations
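A minimal usage sketch of nn_layer (the layer sizes follow the MNIST example above; the layer names are illustrative):

# Build a two-layer network on the 784-dimensional MNIST input x defined earlier
hidden1 = nn_layer(x, 784, 500, 'layer1')
y = nn_layer(hidden1, 500, 10, 'layer2', act=tf.identity)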
4 Description of the tf.summary.merge_all function
# ============================================================================
# Function prototype:
#       def merge_all(key=_ops.GraphKeys.SUMMARIES, scope=None)
# Function description:
#       [1] Merges all the summaries defined previously
#       [2] Like other operations in TensorFlow, tf.summary.scalar, tf.summary.histogram, and tf.summary.image are ops: they are not
#           executed immediately when they are defined, but must be run explicitly via sess.run. Because a program may define many
#           logging operations, TensorFlow provides the tf.summary.merge_all() function to merge all of them into a single operation.
#           When the TensorFlow program executes, running this one operation executes all of the logging operations defined in the
#           code, which makes it easy to write all logs to the log file.
#
# Parameter description:
#       [1] key: specifies the GraphKey used to collect the summaries; defaults to GraphKeys.SUMMARIES
#       [2] scope: optional scope used to filter the summaries to be merged
# ============================================================================
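Related to merge_all, the tf.summary.merge function listed in the module overview merges only an explicit list of summaries, which is useful when you want to run just a subset of them; a minimal sketch, with the loss and accuracy tensors assumed to exist:

# Merge only these two scalar summaries instead of everything in the collection
loss_summary = tf.summary.scalar('loss', loss)
acc_summary = tf.summary.scalar('accuracy', accuracy)
train_merged = tf.summary.merge([loss_summary, acc_summary])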
5 Description of the tf.summary.FileWriter class
# ============================================================================
# Class definition prototype:
#       class FileWriter(SummaryToEventTransformer)
# Class description:
#       [1] Writes Summary protocol buffers to event files on disk
#       [2] The FileWriter class provides a mechanism for creating event files in a given directory and writing summary data to disk
# Constructor:
#       def __init__(self, logdir, graph=None, max_queue=10, flush_secs=120, graph_def=None, filename_suffix=None)
# Parameter description:
#       [1] self: the class instance itself
#       [2] logdir: the directory used to store the [log files]
#       [3] graph: the computation graph to be stored
# Sample application:
#       summary_writer = tf.summary.FileWriter(SUMMARY_DIR, sess.graph): creates a FileWriter object and writes the computation
#       graph to the log file
# ============================================================================
Summary: The sample code for writing summary data to disk looks like this:
merged = tf.summary.merge_all()
# [8] Create a Session
with tf.Session() as sess:
    # [9] Instantiate a FileWriter object and write the current TensorFlow computation graph to the [log file]
    summary_writer = tf.summary.FileWriter(SUMMARY_DIR, sess.graph)
    # [10] Variables created in TensorFlow must be initialized before they can be used; the following runs the initialization
    tf.global_variables_initializer().run()
    # [11] Start training
    for i in range(TRAIN_STEPS):
        xs, ys = mnist.train.next_batch(BATCH_SIZE)
        # [12] Run the training step together with all the log-generating operations to obtain the log data for this run
        summary, _, acc = sess.run([merged, train_step, accuracy], feed_dict={x: xs, y_: ys})
        print('Accuracy at step %s: %s' % (i, acc))
        # [13] Write all logs for this step to the log file, so that TensorBoard can visualize the information from this run
        summary_writer.add_summary(summary, i)
    summary_writer.close()
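As a side note (not from the original example), a common extension of this pattern is to keep separate FileWriter objects for training and validation logs, so that TensorBoard overlays the two curves; the directory names are placeholders:

# Two writers pointing at sibling subdirectories of the same log root
train_writer = tf.summary.FileWriter('logs/train', sess.graph)
test_writer = tf.summary.FileWriter('logs/test')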
6 Description of the add_summary function
# ============================================================================
# Function prototype:
#       def add_summary(self, summary, global_step=None)
# Function description:
#       [1] This function is a member of the parent class of tf.summary.FileWriter (SummaryToEventTransformer)
#       [2] Adds a Summary protocol buffer to the event file and writes it to the event file
# Parameter description:
#       [1] self: the class instance itself
#       [2] summary: the Summary to be written
#       [3] global_step: the number of the current training iteration. Note that without this parameter, the scalar summary will be
#           displayed as a straight line
# Sample application:
#       summary_writer.add_summary(summary, i)
# ============================================================================
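add_summary also accepts a hand-built Summary protocol buffer, which is handy for logging values computed outside the computation graph (for example, in plain Python); a minimal sketch, with the tag name invented for illustration:

# Build a Summary protocol buffer directly and write it; no summary op is needed
manual_summary = tf.Summary(value=[
    tf.Summary.Value(tag='python/learning_rate', simple_value=0.001)
])
summary_writer.add_summary(manual_summary, global_step=i)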