This is a simplified interface for TensorFlow, designed to get people started with predictive analytics and data mining.
The library covers a variety of needs, from linear models to Deep Learning applications such as text and image understanding.
Optionally, you can install scikit-learn and pandas for additional functionality.
Then you can import it with `import tensorflow.contrib.learn.python.learn as learn` (as in the examples below), or simply use `tf.contrib.learn`.
Below are a few simple examples of the API. For more, please see the examples folder.
It's useful to re-scale your dataset to zero mean and unit standard deviation before passing it to the estimator: Stochastic Gradient Descent doesn't always behave well when variables are on very different scales. Categorical variables should likewise be encoded before being passed to the estimator.
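As a minimal sketch of what this preprocessing means (scikit-learn's `preprocessing.StandardScaler`, used in the regression example below, does the scaling for you; the helper names here are just illustrative), standardization and one-hot encoding in plain NumPy look like:

```python
import numpy as np

def standardize(x):
    """Scale each column to zero mean and unit standard deviation."""
    mean = x.mean(axis=0)
    std = x.std(axis=0)
    std[std == 0] = 1.0  # leave constant columns unchanged
    return (x - mean) / std

def one_hot(labels, n_classes):
    """Encode integer category labels as one-hot rows."""
    out = np.zeros((len(labels), n_classes))
    out[np.arange(len(labels)), labels] = 1.0
    return out

features = np.array([[1.0, 100.0], [2.0, 200.0], [3.0, 300.0]])
scaled = standardize(features)   # each column now has mean 0 and std 1
encoded = one_hot(np.array([0, 2, 1]), n_classes=3)
```

After this, both columns contribute comparably to the gradient updates, instead of the second column dominating by virtue of its scale.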
Simple linear classification:
```python
import tensorflow.contrib.learn.python.learn as learn
from sklearn import datasets, metrics

iris = datasets.load_iris()
classifier = learn.LinearClassifier(n_classes=3)
classifier.fit(iris.data, iris.target, steps=200, batch_size=32)
score = metrics.accuracy_score(iris.target, classifier.predict(iris.data))
print("Accuracy: %f" % score)
```
Simple linear regression:
```python
import tensorflow.contrib.learn.python.learn as learn
from sklearn import datasets, metrics, preprocessing

boston = datasets.load_boston()
x = preprocessing.StandardScaler().fit_transform(boston.data)
regressor = learn.LinearRegressor()
regressor.fit(x, boston.target, steps=200, batch_size=32)
score = metrics.mean_squared_error(regressor.predict(x), boston.target)
print("MSE: %f" % score)
```
Example of a 3-layer network with 10, 20 and 10 hidden units respectively:
```python
import tensorflow.contrib.learn.python.learn as learn
from sklearn import datasets, metrics

iris = datasets.load_iris()
classifier = learn.DNNClassifier(hidden_units=[10, 20, 10], n_classes=3)
classifier.fit(iris.data, iris.target, steps=200, batch_size=32)
score = metrics.accuracy_score(iris.target, classifier.predict(iris.data))
print("Accuracy: %f" % score)
```
Example of how to pass a custom model to the Estimator:
```python
import tensorflow.contrib.learn.python.learn as learn
from sklearn import datasets, metrics

iris = datasets.load_iris()

def my_model(x, y):
    """DNN with three hidden layers of 10, 20 and 10 units, and dropout probability of 0.5."""
    layers = learn.ops.dnn(x, [10, 20, 10], dropout=0.5)
    return learn.models.logistic_regression(layers, y)

classifier = learn.Estimator(model_fn=my_model, n_classes=3)
classifier.fit(iris.data, iris.target)
score = metrics.accuracy_score(iris.target, classifier.predict(iris.data))
print("Accuracy: %f" % score)
```
Each estimator has a `save` method which takes a folder path where all model information will be saved. To restore, you can simply call `learn.Estimator.restore(path)` and it will return an object of your class.
Some example code:
```python
classifier = learn.LinearRegressor()
classifier.fit(...)
classifier.save('/tmp/tf_examples/my_model_1/')

new_classifier = learn.Estimator.restore('/tmp/tf_examples/my_model_1/')
new_classifier.predict(...)
```
To get nice visualizations and summaries, you can use the `logdir` parameter on `fit`. It will start writing summaries for `loss` and histograms for the variables in your model. You can also add custom summaries in your custom model function by calling `tf.summary` and passing Tensors to report.
```python
classifier = learn.LinearRegressor()
classifier.fit(x, y, logdir='/tmp/tf_examples/my_model_1/')
```
Then run the following command from the command line:

```shell
tensorboard --logdir=/tmp/tf_examples/my_model_1
```

and follow the reported URL.
Graph visualization: (Text classification RNN Graph image)

Loss visualization: (Text classification RNN Loss image)
See the examples folder for: