Why is predict_on_batch repeating the first output over and over?
Question
I honestly don't know how to describe my problem. Basically, the model produces the output for the first row of my table and then repeats that same output for every other row. Like I said, just look at the output.
Here's my code:
import tensorflow as tf
import numpy as np
x = np.array([[1, 0, 0], #x1
[1, 0.5, 0], #x2
[1, 1, 0], #x3
[0.5, 1, 0], #x4
[0, 1, 0], #x5
[0, 1, 0.5], #x6
[0, 1, 1], #x7
[0, 0.5, 1], #x8
[0, 0, 1], #x9
[0.5, 0, 1], #x10
[1, 0, 1], #x11
[1, 0, 1]]) #x12
#Key:
#[red, orange, yellow, green, blue, purple]
y = np.array([[1,0,0,0,0,0], #y1
[0,1,0,0,0,0], #y2
[0,0,1,0,0,0], #y3
[0,0,0,1,0,0], #y4
[0,0,0,1,0,0], #y5
[0,0,0,1,0,0], #y6
[0,0,0,0,1,0], #y7
[0,0,0,0,1,0], #y8
[0,0,0,0,1,0], #y9
[0,0,0,0,0,1], #y10
[0,0,0,0,0,1], #y11
[1,0,0,0,0,0]]) #y12
# Define the model
model = tf.keras.models.Sequential()
model.add(tf.keras.Input(shape=(3)))
model.add(tf.keras.layers.Dense(10, activation=tf.keras.activations.sigmoid))
model.add(tf.keras.layers.Dense(10, activation=tf.keras.activations.sigmoid))
model.add(tf.keras.layers.Dense(10, activation=tf.keras.activations.sigmoid))
model.add(tf.keras.layers.Dense(10, activation=tf.keras.activations.sigmoid))
model.add(tf.keras.layers.Dense(10, activation=tf.keras.activations.sigmoid))
model.add(tf.keras.layers.Dense(10, activation=tf.keras.activations.sigmoid))
model.add(tf.keras.layers.Dense(10, activation=tf.keras.activations.sigmoid))
model.add(tf.keras.layers.Dense(10, activation=tf.keras.activations.sigmoid))
model.add(tf.keras.layers.Dense(10, activation=tf.keras.activations.sigmoid))
model.add(tf.keras.layers.Dense(10, activation=tf.keras.activations.sigmoid))
model.add(tf.keras.layers.Dense(6, activation=tf.keras.activations.sigmoid))
# Compile the model
model.compile(tf.keras.optimizers.Adam(learning_rate=0.1), "BinaryCrossentropy", metrics=['binary_accuracy'])
model.summary()
history = model.fit(x, y, batch_size=1, epochs=500)
predictions = model.predict_on_batch(x)
print(predictions)
Here is the output:
[[0.16287932 0.07145664 0.08749434 0.26094046 0.25779992 0.16714773]
[0.16287932 0.07145664 0.08749434 0.26094046 0.25779992 0.16714773]
[0.16287932 0.07145664 0.08749434 0.26094046 0.25779992 0.16714773]
[0.16287932 0.07145664 0.08749434 0.26094046 0.25779992 0.16714773]
[0.16287932 0.07145664 0.08749434 0.26094046 0.25779992 0.16714773]
[0.16287932 0.07145664 0.08749434 0.26094046 0.25779992 0.16714773]
[0.16287932 0.07145664 0.08749434 0.26094046 0.25779992 0.16714773]
[0.16287932 0.07145664 0.08749434 0.26094046 0.25779992 0.16714773]
[0.16287932 0.07145664 0.08749434 0.26094046 0.25779992 0.16714773]
[0.16287932 0.07145664 0.08749434 0.26094046 0.25779992 0.16714773]
[0.16287932 0.07145664 0.08749434 0.26094046 0.25779992 0.16714773]
[0.16287932 0.07145664 0.08749434 0.26094046 0.25779992 0.16714773]]
Here is the link to the Replit cover page for more details:
https://replit.com/@EthanKantala/Color-Guesser-Tensorflow?v=1
I have tried adding more neurons and doing some research. I honestly have no idea what to do. Thanks for any help!
Answer 1
Score: 1
I've spotted two problems with your code.
First of all, the string alias for the binary cross-entropy loss is binary_crossentropy, so I got an error running your code on my machine.
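With that alias, the compile call becomes:
model.compile(tf.keras.optimizers.Adam(learning_rate=0.1), "binary_crossentropy", metrics=['binary_accuracy'])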
Second, if you reduce your layer definition to
model.add(tf.keras.Input(shape=(3)))
model.add(tf.keras.layers.Dense(10, activation=tf.keras.activations.sigmoid))
model.add(tf.keras.layers.Dense(6, activation=tf.keras.activations.sigmoid))
then it should work; at least it did on my machine. (With ten stacked sigmoid layers the activations saturate and the gradients vanish, which would explain why every input collapses to the same output.) Also, because you are training for 500 epochs on such a small dataset with such a deep network and then predicting on that same dataset, what you get is almost complete memorization of the dataset.
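For reference, here is a minimal end-to-end sketch with both fixes applied (corrected loss alias, single hidden layer), assuming TensorFlow 2.x and reusing the x and y arrays and the color key from the question:

import tensorflow as tf
import numpy as np

# Same data as in the question: RGB-like inputs, one-hot color labels.
x = np.array([[1, 0, 0], [1, 0.5, 0], [1, 1, 0], [0.5, 1, 0],
              [0, 1, 0], [0, 1, 0.5], [0, 1, 1], [0, 0.5, 1],
              [0, 0, 1], [0.5, 0, 1], [1, 0, 1], [1, 0, 1]])
y = np.array([[1, 0, 0, 0, 0, 0], [0, 1, 0, 0, 0, 0], [0, 0, 1, 0, 0, 0],
              [0, 0, 0, 1, 0, 0], [0, 0, 0, 1, 0, 0], [0, 0, 0, 1, 0, 0],
              [0, 0, 0, 0, 1, 0], [0, 0, 0, 0, 1, 0], [0, 0, 0, 0, 1, 0],
              [0, 0, 0, 0, 0, 1], [0, 0, 0, 0, 0, 1], [1, 0, 0, 0, 0, 0]])

# Shallower network: one hidden sigmoid layer instead of ten.
model = tf.keras.models.Sequential([
    tf.keras.Input(shape=(3,)),  # shape given as a 1-element tuple
    tf.keras.layers.Dense(10, activation="sigmoid"),
    tf.keras.layers.Dense(6, activation="sigmoid"),
])

# Corrected loss alias: "binary_crossentropy", not "BinaryCrossentropy".
model.compile(tf.keras.optimizers.Adam(learning_rate=0.1),
              loss="binary_crossentropy",
              metrics=["binary_accuracy"])

model.fit(x, y, batch_size=1, epochs=500, verbose=0)

predictions = model.predict_on_batch(x)
print(predictions)

# Decode each prediction with the color key from the question.
key = ["red", "orange", "yellow", "green", "blue", "purple"]
for row in predictions:
    print(key[np.argmax(row)])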