Preface

Only a bald head can be strong.

Welcome to star our GitHub repository: github.com/ZhongFuChen…

To recap:

  • Learn TensorFlow from scratch [01- Build environment, HelloWorld]

What does TensorFlow mean? Tensor? Flow? This article introduces some of the basics of TensorFlow.

1. Tensor introduction

Before we get there, one takeaway to keep in mind: TensorFlow uses Tensors to represent data.

So what exactly is a Tensor? The official documentation gives a definition:

Tensors are generalizations of vectors and matrices to potentially higher dimensions, and TensorFlow internally represents tensors as n-dimensional arrays of basic data types.

I don’t know how that sentence reads to you, but I did not understand it at the time: what on earth is a tensor? So I went to Zhihu and searched for the keyword “what is a tensor”, and found a related question: “How to understand tensors in a popular way?”

  • www.zhihu.com/question/23…

I thought that with Zhihu’s help I would finally understand what a tensor is and come away with a clear idea. Surprisingly, most of the respondents were answering with the definition of tensors in physics and mathematics, and then posting piles of formulas I did not understand. There was one relatively straightforward definition:

A quantity that transforms according to a particular law under a change of reference frame is a tensor.

The more answers I read, the more abstract it got. So let’s go back to the official documentation and look at its introductory examples of tensors; then it starts to make sense.

So far we have two conclusions:

  • TensorFlow uses Tensor to represent data
  • TensorFlow internally represents tensors as n-dimensional arrays of primitive data types

In TensorFlow, all data is an n-dimensional array; we just call it a Tensor.

After a lot of agonizing in between, I ended up back at the conclusion from the start: the official definition, translated into terms I find easier to understand… But often, learning is exactly such a process.

1.1 Tensor basics

We already know that a Tensor is really an n-dimensional array. This brings with it a couple of terms:

  • Rank
  • Shape

1.1.1 Rank

The rank is just what we usually call the number of dimensions.

  • If we have a two-dimensional array, its rank is 2
  • If we have a three-dimensional array, its rank is 3

When writing Java in the past, you probably dealt with at most two dimensions, but in machine learning it is very common to have higher dimensions. So how do we count the dimensions? Easy: we just count the brackets. For example, we might see the following array printed:

[[[9 6]
  [6 9]
  [8 8]
  [7 9]]
 [[6 1]
  [3 5]
  [1 7]
  [9 4]]]

We simply count the opening brackets from the start up to the first number. There are three of them, so this is a three-dimensional array, and its rank is 3.
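The bracket-counting trick can be written as a tiny helper in plain Python (a sketch of the idea, independent of TensorFlow; the `rank` function is our own name, and it assumes a non-empty, regularly nested list):

```python
def rank(array):
    """Count nesting depth: a scalar has rank 0, a flat list rank 1, and so on."""
    if isinstance(array, list):
        return 1 + rank(array[0])  # descend into the first subarray
    return 0

# The array from above: three levels of brackets, so rank 3
print(rank([[[9, 6], [6, 9], [8, 8], [7, 9]],
            [[6, 1], [3, 5], [1, 7], [9, 4]]]))  # 3
```

This mirrors what we do by eye: follow the opening brackets down to the first number.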

1.1.2 Shape

The shape of a tensor tells us how many elements there are in each dimension.

If we create a two-dimensional array in Java, int[][] array = new int[3][4], we would describe it as 3 rows and 4 columns. But for higher-dimensional arrays, rows and columns are no longer enough. So in TensorFlow we usually describe it like this:

  • Dimension one has three elements, and dimension two has four elements.

It means exactly the same thing; only the wording has changed.

If we printed the shape of that Java array, we would get shape = (3, 4). Now suppose we see an array with shape = (60000, 28, 28). From the shape alone we can tell:

  • The current array is three-dimensional
  • There are 60,000 elements in the first dimension
  • There are 28 elements in the second dimension
  • There are 28 elements in the third dimension

So given an array, how do we read off its shape at a glance?

For example, m = [[1, 2, 3], [4, 5, 6], [7, 8, 9]] is easy: one look tells us it is a two-dimensional array (a matrix) with three rows and three columns, so its shape is (3, 3).

Now look at another one: t = [[[2], [4], [6]], [[8], [10], [12]], [[14], [16], [18]]]. The nesting of the brackets tells us it is three-dimensional, so the shape must have the form (?, ?, ?). Stripping the outermost brackets leaves [[2], [4], [6]], [[8], [10], [12]], [[14], [16], [18]]

At this point we can see three subarrays, so we can fill in the first ?: the shape so far is (3, ?, ?).

Each subarray, such as [[2], [4], [6]], again has three elements, so the shape becomes (3, 3, ?).

Finally, strip one more level of brackets: each innermost array, such as [2], holds a single element, so the result is shape (3, 3, 1).
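The same walk-inward procedure can be sketched in plain Python (the `shape` helper is our own, not TensorFlow; it assumes a regularly nested list):

```python
def shape(array):
    """Read off the number of elements at each nesting level."""
    dims = []
    while isinstance(array, list):
        dims.append(len(array))
        array = array[0]  # descend into the first subarray
    return tuple(dims)

m = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
t = [[[2], [4], [6]], [[8], [10], [12]], [[14], [16], [18]]]
print(shape(m))  # (3, 3)
print(shape(t))  # (3, 3, 1)
```

Each loop iteration corresponds to stripping one level of brackets, exactly as we did by hand above.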

We can take a look at the following figure to reinforce the above concept:

1.1.3 Tensor data types

As stated above, TensorFlow internally represents tensors as n-dimensional arrays of primitive data types. As with any array, we always know what type of data it stores.

  • We can serialize any data structure to a string and store it in a tf.Tensor. With tf.cast, a tf.Tensor can be converted from one data type to another.
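The serialize-and-cast idea can be illustrated in plain Python (a loose analogy using the standard pickle module and int(); this is not TensorFlow API):

```python
import pickle

# Serialize an arbitrary Python structure to bytes, much like storing a
# serialized structure in a string tensor.
record = {"weights": [1.0, 2.0], "step": 3}
blob = pickle.dumps(record)
restored = pickle.loads(blob)
print(restored == record)  # True

# Converting element types, loosely analogous to tf.cast(t, tf.int32),
# which truncates floats toward zero.
floats = [1.9, 2.1, 3.5]
ints = [int(x) for x in floats]
print(ints)  # [1, 2, 3]
```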

Tensor’s data type is as follows:

2. Special tensors

Special tensors are as follows:

  • tf.Variable – variable
  • tf.constant – constant
  • tf.placeholder – placeholder
  • tf.SparseTensor – sparse tensor

This time we will cover the first three (they are easier to understand): constants, variables, and placeholders.

2.1 Constants

A constant is exactly what it sounds like: once created, its value never changes. (I believe you get the idea.)

In TensorFlow, the way to create constants is simple:

a = tf.constant(2)
b = tf.constant(3)

2.2 Variables

Variables are also easy to understand (just map the concept over from any programming language). Generally speaking, the parameters learned during training are stored in variables, because they change constantly.

There are two ways to create variables in TensorFlow:


# 1. Create with the Variable class
# tf.random_normal returns a tensor of shape (1, 4) whose four elements are
# drawn from a normal distribution with mean 100 and standard deviation 0.35.
W = tf.Variable(initial_value=tf.random_normal(shape=(1, 4), mean=100, stddev=0.35), name="W")
b = tf.Variable(tf.zeros([4]), name="b")

# 2. Create with get_variable
my_int_variable = tf.get_variable("my_int_variable", [1, 2, 3], dtype=tf.int32,
                                  initializer=tf.zeros_initializer)

Note that after creating variables, we must initialize them (inside a session) before they can be used:

sess.run(tf.global_variables_initializer())

2.3 Placeholders

I first encountered the concept of a placeholder in JDBC. Since the parameters of an SQL statement are not known until they are passed in, we might write a statement like: select * from user where id = ?

Similarly, in TensorFlow some values may not be known until run time, so TensorFlow gives us placeholders.

Using placeholders in TensorFlow is also simple:

# The file names are not known until run time
train_filenames = tf.placeholder(tf.string, shape=[None])

# ... a bunch of details omitted

# At run time, the placeholder's value is supplied via feed_dict
feed_dict = {train_filenames: training_filenames}

None of this is new compared with ordinary programming languages; only the syntax has changed.

3. Flow: graphs and nodes

Flow means exactly what it says: flowing. Put the two together and we have tensors that flow: Tensor + Flow.

In fact, TensorFlow uses a graph to represent a computational task. By default, TensorFlow gives us a blank graph, which we call a “data flow graph”. A data flow graph is made up of nodes and directed edges. When we use TensorFlow, we create all sorts of nodes in the graph, and Tensors flow between those nodes. Hence the name: TensorFlow.

So how do we create a node? In TensorFlow, there are three kinds of nodes:

  • Storage nodes: stateful variable operations, typically used to store model parameters
  • Compute nodes: stateless computation and control operations, responsible for the logic and flow of the algorithm
  • Data nodes: placeholder operations for data, describing input fed in from outside the graph

TensorFlow generates a node for each variable, constant, and placeholder. Each such Operation is simply called an op.

So an op is just an operation performed in TensorFlow (it may be the creation of a variable, or it may be a computation). Common ops in TensorFlow include the following:

In plain English: TensorFlow gives us a blank data flow graph, and we populate it (by creating nodes) to achieve the desired effect.

  • You start with one blank graph; the rest is up to what you build!

Let’s take a look at the official GIF showing the data flow diagram to give a better impression.

  • TensorFlow uses data flow diagrams to represent computational tasks
  • TensorFlow uses the Tensor to represent the data, the Tensor flows through the data flow diagram.
  • In TensorFlow, activities such as creating nodes and running computations are collectively called ops

4. What is a session?

TensorFlow programs are typically organized into a construction phase and an execution phase. In the construction phase, the ops are assembled into a graph. In the execution phase, a session is used to execute the ops in the graph.

  • Note: because the edges are directed, a node cannot run until all of its predecessor nodes have finished.

In plain English, when we write code we are describing, on the blank graph TensorFlow gives us, the graph we want. But to actually get results from the graph, we have to run it through a session.

Here’s a quick example:

import tensorflow as tf

# y = W * x + b, where W and b are storage nodes and x is a data node.
x = tf.placeholder(tf.float32)
W = tf.Variable(1.0)
b = tf.Variable(1.0)
y = W * x + b

# ========= If you don't use session to run, the above code is just a graph. We run the graph through session to get the desired result
with tf.Session() as sess:
    tf.global_variables_initializer().run() # Operation.run
    fetch = y.eval(feed_dict={x: 3.0})      # Tensor.eval
    print(fetch)   # fetch = 1.0 * 3.0 + 1.0
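The construct-then-run pattern above can be mimicked in a few lines of plain Python, purely as an analogy (the `placeholder`, `mul`, `add`, and `run` helpers below are toy stand-ins of our own, not TensorFlow API):

```python
# Construction phase: build a description of the computation; nothing runs yet.
def placeholder(name):
    return ("placeholder", name)

def mul(a, b):
    return ("mul", a, b)

def add(a, b):
    return ("add", a, b)

# Execution phase: walk the graph, filling placeholders from feed_dict.
def run(node, feed_dict):
    if isinstance(node, (int, float)):
        return node
    op = node[0]
    if op == "placeholder":
        return feed_dict[node[1]]
    if op == "mul":
        return run(node[1], feed_dict) * run(node[2], feed_dict)
    if op == "add":
        return run(node[1], feed_dict) + run(node[2], feed_dict)
    raise ValueError("unknown op: %s" % op)

# y = W * x + b, mirroring the TensorFlow example above
x = placeholder("x")
y = add(mul(1.0, x), 1.0)
print(run(y, {"x": 3.0}))  # 1.0 * 3.0 + 1.0 = 4.0
```

The key point the analogy captures: building `y` computes nothing; only `run` (the stand-in for session.run) walks the graph and produces a value.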

4.1 What is Fetch?

Fetch simply means passing one or more tensors to session.run and getting their values back.

4.2 Tensor.eval() and Operation.run()

When reading other material, you may come across code that calls not session.run but Tensor.eval() and Operation.run(). In the end, both of these call session.run. The difference is that session.run can return multiple tensors at once (via fetches).

Finally

I once saw a summary that puts it well:

  • Use tensors to represent data.
  • Use a graph to represent the computational task.
  • Run the graph in a session.
  • Maintain state through variables (tf.Variable).

TensorFlow is a programming system that uses graphs to represent computational tasks. The nodes in a graph are called ops (short for operations). An op takes zero or more Tensors, performs a computation, and produces zero or more Tensors. Each Tensor is a typed multidimensional array.

This article briefly explained what TensorFlow means, along with some basic concepts. I have only described the common concepts the way I understand them; there is much more to learn (for example, which parameters to specify when creating variables… .), and for that I will hand you over to the official site, blogs, and books.

I believe that once you understand these concepts, you will get twice the result with half the effort!

Stay tuned for the next TensorFlow article

References:

  • Juejin. Cn/post / 684490…
  • Github.com/geektime-ge…

A Java tech public account that loves sharing solid content: Java3y. With more than 200 original technical articles, massive video resources, and beautiful mind maps, it is well worth a follow!

If you think this article is well written, give it a thumbs-up!