
How do I use the Tensorboard callback of Keras?

I have built a neural network with Keras. I would like to visualize its data with TensorBoard, so I have used:

keras.callbacks.TensorBoard(log_dir='/Graph', histogram_freq=0,
                            write_graph=True, write_images=True)

as explained on keras.io. When I run the callback I get <keras.callbacks.TensorBoard at 0x7f9abb3898>, but I don't get any files in my "Graph" folder. Is there something wrong with how I have used this callback?

I would suggest setting histogram_freq to 1. "histogram_freq: frequency (in epochs) at which to compute activation histograms for the layers of the model. If set to 0, histograms won't be computed."
Be careful: "/Graph" makes a directory in the root directory, while "./Graph" makes one in the working directory.
@MattKleinsmith If set to 0, only the activation and weight histograms for the layers of the model won't be computed from the validation data; metrics will still be logged.
I think it's better to give the log dir a unique name; see stackoverflow.com/a/54949146/1179925
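Following the suggestion to give the log dir a unique name, here is a minimal sketch (stdlib only; the helper name and timestamp format are my own choices):

```python
import os
from datetime import datetime

def unique_log_dir(root="./Graph"):
    """Return a per-run log directory such as ./Graph/20240101-120000."""
    return os.path.join(root, datetime.now().strftime("%Y%m%d-%H%M%S"))

log_dir = unique_log_dir()
# Then pass it to the callback, e.g.:
# tbCallBack = keras.callbacks.TensorBoard(log_dir=log_dir, histogram_freq=1)
```

With a fresh directory per run, TensorBoard shows each run as a separate curve instead of overwriting the previous one.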

Eric O Lebigot
keras.callbacks.TensorBoard(log_dir='./Graph', histogram_freq=0,  
          write_graph=True, write_images=True)

This line creates a TensorBoard callback object; you should capture that object and pass it to the fit function of your model.

tbCallBack = keras.callbacks.TensorBoard(log_dir='./Graph', histogram_freq=0, write_graph=True, write_images=True)
...
model.fit(...inputs and parameters..., callbacks=[tbCallBack])

This way you pass your callback object to the function. It will run during training and will output files that can be used with TensorBoard.

If you want to visualize the files created during training, run this in your terminal:

tensorboard --logdir path_to_current_dir/Graph 

Hope this helps!


I used this and got the following error when write_images=False:
InvalidArgumentError (see above for traceback): Tensor must be 4-D with last dim 1, 3, or 4, not [1,3,3,256,256,1] [[Node: conv_3.2_2/kernel_0_1 = ImageSummary[T=DT_FLOAT, bad_color=Tensor, max_images=3, _device="/job:localhost/replica:0/task:0/cpu:0"](conv_3.2_2/kernel_0_1/tag, ExpandDims_50)]]
And when write_images=True, something saying a placeholder is missing dtype=float. Any idea?
The Scalars tab is still empty, although I can see my model architecture on the Graphs tab?
This only produces scalars for training loss & accuracy. How do you do the same for the validation_data that is passed to the fit function?
today

This is how you use the TensorBoard callback:

from keras.callbacks import TensorBoard

tensorboard = TensorBoard(log_dir='./logs', histogram_freq=0,
                          write_graph=True, write_images=False)
# define model
model.fit(X_train, Y_train,
          batch_size=batch_size,
          epochs=nb_epoch,
          validation_data=(X_test, Y_test),
          shuffle=True,
          callbacks=[tensorboard])

Is there a way to structure the output of TensorBoard better? Does Keras do some optimization in that regard?
@nickpick I don't know what you mean. But I think this might be a candidate for another question.
Important to note: set histogram_freq=0 if you don't want TensorBoard to log any histograms via tf.summary.histogram; otherwise histogram_freq must NOT equal 0!
Leandro Souza

Change

keras.callbacks.TensorBoard(log_dir='/Graph', histogram_freq=0,  
          write_graph=True, write_images=True)

to

tbCallBack = keras.callbacks.TensorBoard(log_dir='Graph', histogram_freq=0,  
          write_graph=True, write_images=True)

and set your model

tbCallBack.set_model(model)

Run in your terminal

tensorboard  --logdir Graph/

I got AttributeError: 'TensorBoard' object has no attribute 'set_model'.
Andrew

If you are working with the Keras library and want to use TensorBoard to print graphs of accuracy and other variables, then below are the steps to follow.

Step 1: Import TensorBoard from the Keras callbacks module using the command below:

from keras.callbacks import TensorBoard

Step 2: Include the command below in your program, just before the model.fit() call.

tensor_board = TensorBoard(log_dir='./Graph', histogram_freq=0, write_graph=True, write_images=True)

Note: Use "./Graph". It will generate the Graph folder in your current working directory; avoid using "/Graph".
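The difference between the two paths can be illustrated with plain Python (stdlib only; no Keras involved):

```python
import os

# A leading slash makes the path absolute: it ignores the working directory.
absolute = os.path.abspath("/Graph")

# "./Graph" (or just "Graph") is resolved relative to the working directory.
relative = os.path.abspath("./Graph")

print(absolute)   # "/Graph" at the filesystem root (on POSIX systems)
print(relative)   # e.g. "/home/user/project/Graph"
```

This is why log_dir='/Graph' leaves nothing next to your script: the files go to the root of the filesystem (if you even have permission to write there).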

Step 3: Include the TensorBoard callback in model.fit(). A sample is given below.

model.fit(X_train, y_train, batch_size=batch_size, epochs=nb_epoch, verbose=1, validation_split=0.2, callbacks=[tensor_board])

Step 4: Run your code and check whether the Graph folder is in your working directory. If the code above works correctly, you will have a "Graph" folder in your working directory.

Step 5: Open a terminal in your working directory and type the command below.

tensorboard --logdir ./Graph

Step 6: Now open your web browser and enter the address below.

http://localhost:6006

After entering it, the TensorBoard page will open, where you can see your graphs of the different variables.


Kings85

Here is some code:

K.set_learning_phase(1)
K.set_image_data_format('channels_last')

tb_callback = keras.callbacks.TensorBoard(
    log_dir=log_path,
    histogram_freq=2,
    write_graph=True
)
tb_callback.set_model(model)
callbacks = []
callbacks.append(tb_callback)

# Train net:
history = model.fit(
    [x_train],
    [y_train, y_train_c],
    batch_size=int(hype_space['batch_size']),
    epochs=EPOCHS,
    shuffle=True,
    verbose=1,
    callbacks=callbacks,
    validation_data=([x_test], [y_test, y_test_coarse])
).history

# Test net:
K.set_learning_phase(0)
score = model.evaluate([x_test], [y_test, y_test_coarse], verbose=0)

Basically, histogram_freq=2 is the most important parameter to tune when calling this callback: it sets the interval (in epochs) at which histograms are logged, with the goal of generating fewer files on disk.

So here is an example visualization of the evolution of values for the last convolution throughout training, as seen in TensorBoard under the "Histograms" tab (I found the "Distributions" tab to contain very similar charts, but flipped on their side):

https://raw.githubusercontent.com/Vooban/Hyperopt-Keras-CNN-CIFAR-100/master/tensorboard_histogram_example.png

In case you would like to see a full example in context, you can refer to this open-source project: https://github.com/Vooban/Hyperopt-Keras-CNN-CIFAR-100


I downvoted this because a large part of this is actually questions and not an answer to the question. Don't ask new questions in answers, whether it is a part or the entire purpose of an answer.
I edited the answer to remove what you mentioned. In fact, this callback is very hard to use properly from the documentation at the time I answered.
To answer "How do I use the TensorBoard callback of Keras?", all the other answers are incomplete and respond only to the small context of the question - no one tackles embeddings, for example. At least, I documented potential errors and things to avoid in my answer. I think I raised important questions that no one has even thought about yet. I am still waiting for a complete answer. This callback is ill-documented, too.
DINA TAKLIT

If you are using Google Colab, a simple way to visualize the graph would be:

import tensorboardcolab as tb

tbc = tb.TensorBoardColab()
tensorboard = tb.TensorBoardColabCallback(tbc)


history = model.fit(x_train,# Features
                    y_train, # Target vector
                    batch_size=batch_size, # Number of observations per batch
                    epochs=epochs, # Number of epochs
                    callbacks=[early_stopping, tensorboard], # Early stopping
                    verbose=1, # Print description after each epoch
                    validation_split=0.2, #used for validation set every each epoch
                    validation_data=(x_test, y_test)) # Test data-set to evaluate the model in the end of training

rsc

Create the Tensorboard callback:

from keras.callbacks import TensorBoard
from datetime import datetime
logDir = "./Graph/" + datetime.now().strftime("%Y%m%d-%H%M%S") + "/"
tb = TensorBoard(log_dir=logDir, histogram_freq=2, write_graph=True, write_images=True, write_grads=True)

Pass the Tensorboard callback to the fit call:

history = model.fit(X_train, y_train, epochs=200, callbacks=[tb])

When running the model, if you get a Keras error of

"You must feed a value for placeholder tensor"

try resetting the Keras session before the model creation by doing:

import keras.backend as K
K.clear_session()

It fixed the "You must feed a value for placeholder tensor" issue for me. Any idea why?
Part

You wrote log_dir='/Graph'. Did you mean ./Graph instead? At the moment you are sending it to /Graph at the root of the filesystem.


A path starting with / is absolute, so it bypasses the working directory entirely; that is why no "Graph" folder appears next to your script.
nicodjimenez

You should check out Losswise (https://losswise.com); it has a plugin for Keras that's easier to use than TensorBoard and has some nice extra features. With Losswise you'd just use from losswise.libs import LosswiseKerasCallback and then callback = LosswiseKerasCallback(tag='my fancy convnet 1') and you're good to go (see https://docs.losswise.com/#keras-plugin).


Disclaimer: OP is the founder of Losswise, which is a paid product (although with a pretty generous free tier)
@MichaelMior is correct, although it isn't a paid product yet and may never be (other than on prem licenses in the future maybe)
Michael Mior

There are a few things.

First, not /Graph but ./Graph.

Second, when you use the TensorBoard callback, always pass validation data, because without it, it won't start.

Third, if you want to use anything except scalar summaries, then you should only use the fit method, because fit_generator will not work. Alternatively, you can rewrite the callback to work with fit_generator.

To add callbacks, just add them to model.fit(..., callbacks=your_list_of_callbacks)
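The points above can be combined into a minimal runnable sketch (tensorflow.keras, toy random data; the shapes, layer sizes, and the temp-dir log path are my own choices for illustration):

```python
import os
import tempfile

import numpy as np
from tensorflow import keras

# Toy random data; the shapes are arbitrary, just for illustration.
x_train = np.random.rand(64, 10).astype("float32")
y_train = np.random.randint(0, 2, size=(64, 1)).astype("float32")
x_val = np.random.rand(16, 10).astype("float32")
y_val = np.random.randint(0, 2, size=(16, 1)).astype("float32")

model = keras.Sequential([
    keras.Input(shape=(10,)),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# A relative path such as "./Graph" keeps the logs in the working directory;
# a temp dir is used here only to keep the sketch self-contained.
log_dir = os.path.join(tempfile.mkdtemp(), "Graph")
tb = keras.callbacks.TensorBoard(log_dir=log_dir)

# Passing validation_data is what makes the val_* scalars show up.
history = model.fit(x_train, y_train,
                    epochs=1,
                    validation_data=(x_val, y_val),
                    callbacks=[tb],
                    verbose=0)
```

After a run like this, tensorboard --logdir on that directory shows both training and validation scalars.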

