TensorFlow
 
Revision as of 23:41, 29 April 2018

TensorFlow is an open-source software library for dataflow programming across a range of tasks. It is a symbolic math library, and is also used for machine learning applications such as neural networks.[1]

Introduction

Tensors
  • N-dimensional arrays
  • Measured by "rank"
  • All elements have the same datatype
# Rank 0 (a scalar):
1
# Rank 1 (a vector):
[1 2 3]
# Rank 2 (a matrix):
[1 2 3]
[4 5 6]
# Rank 3 (a 3-D array, here with shape (1, 3, 3)):
[[[1 2 3]
  [4 5 6]
  [7 8 9]]]
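The same ranks can be inspected from the Python API (a minimal sketch, assuming the TensorFlow 1.x API; the values mirror the examples above):

import tensorflow as tf

# Rank 0: a scalar
scalar = tf.constant(1)
# Rank 1: a vector
vector = tf.constant([1, 2, 3])
# Rank 2: a matrix
matrix = tf.constant([[1, 2, 3],
                      [4, 5, 6]])

# All elements of a given tensor share one datatype
print(scalar.dtype, vector.dtype, matrix.dtype)                      # int32 for all three
# Rank = number of dimensions
print(scalar.shape.ndims, vector.shape.ndims, matrix.shape.ndims)    # 0 1 2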
Tensor operations
  • Addition and subtraction
  • Multiplication and division
  • Matrix multiplication
  • Dot product
  • Transpose
[1 2  3  4 ]   [1 5 9 ]T
|5 6  7  8 | = |2 6 10|
[9 10 11 12]   |3 7 11|
               [4 8 12]
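Each of these operations has a corresponding TensorFlow op (a minimal sketch, assuming the TensorFlow 1.x API; the matrix is the one from the transpose example above, and the two vectors used for the dot product are made up):

import tensorflow as tf

a = tf.constant([[1,  2,  3,  4],
                 [5,  6,  7,  8],
                 [9, 10, 11, 12]])       # shape (3, 4)

a_t = tf.transpose(a)                    # shape (4, 3), as in the example above
summed = tf.add(a, a)                    # element-wise addition
scaled = tf.multiply(a, 2)               # element-wise multiplication
product = tf.matmul(a, a_t)              # matrix multiplication: (3, 4) x (4, 3) -> (3, 3)

x = tf.constant([1.0, 2.0, 3.0])
y = tf.constant([4.0, 5.0, 6.0])
dot = tf.reduce_sum(tf.multiply(x, y))   # dot product = 32.0

with tf.Session() as sess:
    print(sess.run(a_t))
    print(sess.run(dot))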
TensorFlow building blocks
  • Lower level
    • Tensors
    • Operations
    • Graphs and sessions
  • Higher level
    • Loss functions
    • Optimizers
    • Layers
    • Estimators
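At the lower level, a program first builds a graph of tensors and operations and then executes it in a session (a minimal sketch, assuming the TensorFlow 1.x graph/session API; the variable and input values are made up):

import tensorflow as tf

# Build the graph: nothing is computed yet
x = tf.placeholder(tf.float32, shape=(None,), name="x")
w = tf.Variable(2.0, name="w")
y = w * x + 1.0

# Execute the graph in a session
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(y, feed_dict={x: [1.0, 2.0, 3.0]}))   # [3. 5. 7.]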
Loss functions
  • Differentiable functions that measure the difference (error) between true and predicted values
  • Common types:
    • Mean squared error (MSE)
    • Log loss
    • Cosine distance
    • Cross entropy
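For example, MSE can be written out from its definition or taken from the built-in helper (a minimal sketch, assuming the TensorFlow 1.x API; y_true and y_pred are made-up values):

import tensorflow as tf

y_true = tf.constant([1.0, 2.0, 3.0])
y_pred = tf.constant([1.5, 1.5, 3.0])

# Mean squared error by hand: mean((y_true - y_pred)^2)
mse = tf.reduce_mean(tf.square(y_true - y_pred))
# The same loss via the built-in helper
mse_builtin = tf.losses.mean_squared_error(labels=y_true, predictions=y_pred)

with tf.Session() as sess:
    print(sess.run([mse, mse_builtin]))   # both ~0.1667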
Optimizers
  • Optimizers are algorithms that minimize the loss (or error) of a model
  • Local minimum vs. global minimum
  • Built-in optimizers inherit from the Optimizer class
  • Common types:
    • Gradient descent
    • Adam
    • RMSProp
    • Adagrad
    • Momentum
    • Adadelta
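For example, plain gradient descent driving a one-variable loss toward its (global) minimum (a minimal sketch, assuming the TensorFlow 1.x tf.train API; the quadratic loss is a made-up toy example):

import tensorflow as tf

w = tf.Variable(5.0)
loss = tf.square(w - 3.0)     # single global minimum at w = 3

# GradientDescentOptimizer (like AdamOptimizer, RMSPropOptimizer, etc.) inherits
# from tf.train.Optimizer
train_op = tf.train.GradientDescentOptimizer(learning_rate=0.1).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(100):
        sess.run(train_op)
    print(sess.run(w))        # approximately 3.0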
Layers
  • What are they?
    • Composed of tensors and operations forming the model
    • Generally connected in series
    • Pre-made functions for creating layers in a model
  • Common types:
    • Input
    • Convolutional (1d, 2d, 3d)
    • Pooling
    • Dropout
    • Dense
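A small series of layers of the types listed above (a minimal sketch, assuming the TensorFlow 1.x tf.layers API; the 28x28 single-channel input shape is a made-up example):

import tensorflow as tf

# Input layer: a batch of 28x28 single-channel images (example shape)
inputs = tf.placeholder(tf.float32, shape=(None, 28, 28, 1))

# Layers connected in series
conv = tf.layers.conv2d(inputs, filters=32, kernel_size=3, activation=tf.nn.relu)
pool = tf.layers.max_pooling2d(conv, pool_size=2, strides=2)
flat = tf.layers.flatten(pool)
drop = tf.layers.dropout(flat, rate=0.5)
logits = tf.layers.dense(drop, units=10)      # dense (fully connected) output layer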
Estimators
  • High-level API that wraps the main stages of a model workflow:
    • Training
    • Evaluation
    • Prediction
    • Build graph
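A pre-made Estimator handles all of these stages behind one interface (a minimal sketch, assuming the TensorFlow 1.x tf.estimator API; the y = 2x toy data is made up):

import numpy as np
import tensorflow as tf

# Made-up toy data: y = 2x
x_train = np.array([[1.0], [2.0], [3.0], [4.0]], dtype=np.float32)
y_train = np.array([2.0, 4.0, 6.0, 8.0], dtype=np.float32)

feature_columns = [tf.feature_column.numeric_column("x", shape=[1])]
estimator = tf.estimator.LinearRegressor(feature_columns=feature_columns)   # builds the graph internally

train_fn = tf.estimator.inputs.numpy_input_fn(
    {"x": x_train}, y_train, batch_size=4, num_epochs=None, shuffle=True)
eval_fn = tf.estimator.inputs.numpy_input_fn(
    {"x": x_train}, y_train, batch_size=4, num_epochs=1, shuffle=False)
pred_fn = tf.estimator.inputs.numpy_input_fn(
    {"x": np.array([[5.0]], dtype=np.float32)}, num_epochs=1, shuffle=False)

estimator.train(input_fn=train_fn, steps=200)       # training
print(estimator.evaluate(input_fn=eval_fn))         # evaluation
print(list(estimator.predict(input_fn=pred_fn)))    # prediction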

References

  1. "TensorFlow: Open source machine learning". "It is machine learning software being used for various kinds of perceptual and language understanding tasks" — Jeffrey Dean, minute 0:47 / 2:17 of the YouTube clip.

External links