Edited, memorised or added to reading queue

on 26-Jun-2025 (Thu)


Flashcard 7674299026700

Tags
#tensorflow #tensorflow-certificate
Question

loss function:

  • In case of [...(loss function?)] the labels have to be one-hot encoded

Answer
categorical_crossentropy


Parent (intermediate) annotation

loss function: In case of categorical_crossentropy the labels have to be one-hot encoded

Original toplevel document

TfC_02_classification-PART_2
y-axis -> true label, x-axis -> predicted label

# Create confusion matrix
from sklearn.metrics import confusion_matrix
y_preds = model_8.predict(X_test)
confusion_matrix(y_test, y_preds)

important: This time there is a problem with the loss function. In case of categorical_crossentropy the labels have to be one-hot encoded. In case of labels as integers use SparseCategoricalCrossentropy.

# Get the patterns of a layer in our network
weights, biases = model_35.layers[1].get_weights()
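
For reference, a minimal runnable sketch of the distinction this card describes, using an illustrative 3-class model and dummy data (the array and model names below are assumptions, not taken from the course notebook): CategoricalCrossentropy expects one-hot encoded labels, while SparseCategoricalCrossentropy takes integer class ids.

import numpy as np
import tensorflow as tf

# Dummy data: 100 samples, 8 features, 3 classes (shapes are illustrative).
X_train = np.random.rand(100, 8).astype("float32")
y_train_int = np.random.randint(0, 3, size=(100,))   # integer labels: 0, 1, 2
y_train_onehot = tf.one_hot(y_train_int, depth=3)    # one-hot encoded labels

def make_model():
    return tf.keras.Sequential([
        tf.keras.layers.Dense(10, activation="relu"),
        tf.keras.layers.Dense(3, activation="softmax"),
    ])

# Integer labels -> SparseCategoricalCrossentropy
model_sparse = make_model()
model_sparse.compile(optimizer=tf.keras.optimizers.Adam(),
                     loss=tf.keras.losses.SparseCategoricalCrossentropy(),
                     metrics=["accuracy"])
model_sparse.fit(X_train, y_train_int, epochs=2, verbose=0)

# One-hot labels -> CategoricalCrossentropy
model_onehot = make_model()
model_onehot.compile(optimizer=tf.keras.optimizers.Adam(),
                     loss=tf.keras.losses.CategoricalCrossentropy(),
                     metrics=["accuracy"])
model_onehot.fit(X_train, y_train_onehot, epochs=2, verbose=0)

Pairing the wrong label format with either loss typically fails with a shape-mismatch error at fit time, which is the "problem with loss function" the note refers to.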







Flashcard 7710425353484

Tags
#tensorflow #tensorflow-certificate
Question

Bag of tricks to improve model

3. Fit the model - more [...], more data examples

Answer
epochs


Parent (intermediate) annotation

Bag of tricks to improve model 3. Fit the model - more epochs, more data examples

Original toplevel document

TfC_02_classification-PART_1
...nse(10, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid')
])
model.compile(optimizer=tf.keras.optimizers.Adam(),
              loss=tf.keras.losses.binary_crossentropy,
              metrics=['accuracy'])

Bag of tricks to improve model:
1. Create model - more layers, more neurons, different activation
2. Compile model - other loss, other optimizer, change optimizer parameters
3. Fit the model - more epochs, more data examples

# Plots model predictions against true data
import numpy as np
def plot_decision_boundry(model, X, y):
    """
    Take in a trained model, features and labels and create numpy.meshgrid of the d...
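
A hedged illustration of trick 3 ("fit the model" with more epochs, more data), using a compile step matching the excerpt above; the dummy data, layer sizes, and epoch count are placeholder assumptions rather than values from the course notebook:

import numpy as np
import tensorflow as tf

# Illustrative binary-classification data (more rows = "more data examples").
X_train = np.random.rand(1000, 2).astype("float32")
y_train = np.random.randint(0, 2, size=(1000, 1)).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer=tf.keras.optimizers.Adam(),
              loss=tf.keras.losses.binary_crossentropy,
              metrics=['accuracy'])

# "Fit the model" tricks: train longer (more epochs) and/or on more examples.
history = model.fit(X_train, y_train, epochs=100, verbose=0)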







Flashcard 7710427450636

Tags
#RNN #ariadne #behaviour #consumer #deep-learning #priority #recurrent-neural-networks #retail #simulation #synthetic-data
Question
To achieve better explainability, in many e-commerce applications consumer behavior can be viewed on the level of [...].
Answer
sessions


Parent (intermediate) annotation

To achieve better explainability, in many e-commerce applications consumer behavior can be viewed on the level of sessions.

Original toplevel document (pdf)

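
A minimal, hypothetical sketch of what "viewing consumer behaviour on the level of sessions" can mean in practice: grouping a user's raw clickstream events into sessions separated by an inactivity gap. The 30-minute threshold, the event format, and the sessionize helper below are illustrative assumptions, not taken from the source paper.

from datetime import datetime, timedelta

# Illustrative clickstream for one user: (timestamp, event) pairs.
events = [
    (datetime(2025, 6, 26, 9, 0), "view:shoes"),
    (datetime(2025, 6, 26, 9, 5), "add_to_cart:shoes"),
    (datetime(2025, 6, 26, 14, 0), "view:jacket"),    # long gap -> new session
    (datetime(2025, 6, 26, 14, 2), "purchase:jacket"),
]

def sessionize(events, gap=timedelta(minutes=30)):
    """Split a time-ordered event stream into sessions by inactivity gap."""
    sessions, current = [], []
    last_ts = None
    for ts, action in events:
        if last_ts is not None and ts - last_ts > gap:
            sessions.append(current)
            current = []
        current.append(action)
        last_ts = ts
    if current:
        sessions.append(current)
    return sessions

print(sessionize(events))
# [['view:shoes', 'add_to_cart:shoes'], ['view:jacket', 'purchase:jacket']]

Each session (rather than each raw click) then becomes the unit of analysis, which is what makes the learned behaviour easier to inspect and explain.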