26 June, 2017

"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
""
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Keras Tutorial"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### by Anne Peter (anne.peter@uni-weimar.de)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"---"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 1. What is Keras?"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"\"Keras is a high-level Neural Network library, written in Python and capable of running on top of either TensorFlow or Theano. It was developed with a focus on enabling fast experimentation. _Being able to go from idea to result with the least possible delay is key to doing good research._\" - https://keras.io/"
]
},
## 2. What is TensorFlow?

"TensorFlow is an open source software library for machine intelligence and neural networks." - https://www.tensorflow.org/
## 3. Artificial Neural Networks (ANN)
ANNs are a computational model for simulating the complexity of the human brain.
They contain artificial neurons and connections between them, like the biological brain. These neurons are also called units.
ANNs can approximate any function and solve complicated problems without prior knowledge about them.

### Units and Layers

Units transform their input into some output through a so-called activation function.

A unit gets its inputs from the other units that are connected to it.

The strength of the (directed) connection between units is expressed by weights. The higher the weight, the stronger the connection and the influence of the other unit.

Units are arranged in layers:
- input layer (always one) with input units
- hidden layers (zero or more) with hidden units
- output layer (always one) with output units

Biological neurons influence each other: if they are connected, their outputs affect each other, so the input of one neuron is composed of the outputs of the neurons connected to it. Neurons "fire" if they are stimulated enough (if they receive enough input).

\n", "In ANNs the weights of the connections are multiplied by the respectively outputs and summed up to be the input of the next unit. Each unit has an activation function, which tells how to set its output depending on the given input." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Common activation functions:" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Training and Testing" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The ANN gains new knowledge (learns) when the weights are modified. This phase is called training. Training data is passed to the ANN and the weights are changed due to a learning rule such as supervised learning.\n", "

When the ANN's ability needs to be tested, the testing phase begins. An unseen test set is given to see how the network performs on new data.

Terminology:

- one epoch = one pass over all of the training data
- batch size = the number of training examples in one pass (the higher the batch size, the more memory you'll need)
- number of iterations = number of passes, each pass using [batch size] examples

### Deep Learning

\n", "Now it is common to have neural networks with 10+ layers and even 100+ layer ANNs are being tried. Therefore they are called

\n", "Keras offers two different APIs to construct a model: a functional and a sequential one. We’re using the sequential API so we import Sequential from keras.models.

\n", "There are a bunch of different layer types available in Keras. These different types of layer help us to model individual kinds of neural nets for various machine learning tasks. In our specific case the Dense layer is what we want." ] }, { "cell_type": "code", "execution_count": 8, "metadata": { "collapsed": true }, "outputs": [], "source": [ "import numpy as np\n", "from keras.models import Sequential\n", "from keras.layers.core import Activation, Dense" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Then we have two sets of data, the training data and target/solution data, to tell the ANN the expected outcome (supervised learning).

\n", "We initialize training_data as a two-dimensional array (an array of arrays) where each of the inner arrays has exactly two items. Each of these pairs has a corresponding expected result in target_data.

\n", "We setup target_data as another two-dimensional array. All the inner arrays in target_data contain just a single item. Each inner array of training_data relates to its counterpart in target_data. That’s essentially what we want the neural net to learn over time. The value [0, 0] means 0, [0, 1] means 1 and so on." ] }, { "cell_type": "code", "execution_count": 9, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# the four different states of XOR\n", "training_data = np.array([[0,0],[0,1],[1,0],[1,1]], \"float32\")\n", "\n", "# the four expected results in the same order\n", "target_data = np.array([[0],[1],[1],[0]], \"float32\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ " " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The first line sets up an empty model using the Sequential API. This is used to implement simple models. You simply keep adding layers to the existing model.

\n", "We’re adding a Dense layer to our model. We set input_dim = 2 because each of our input samples is an array of length 2 ([0, 1], [1, 0] etc.). If we had input data such as [0, 1, 1] our input_dim would be 3.

\n", "The more interesting question is: What does the 16 stand for? It’s the dimension of the output for this layer. If we think about our model in terms of neurons it means that we have two input neurons (input_dim = 2) spreading into 16 neurons in a so called hidden layer.

\n", "We also added another layer with an output dimension of 1 and without an explicit input dimension. In this case the input dimension is implicitly bound to be 16 since that’s the output dimension of the previous layer.

\n", "By setting activation = 'relu' (rectified linear unit) we specify that we want to use the rectifier function as the activation function.

\n", "By setting activation = 'sigmoid' we specify that we want to use the sigmoid activation function." ] }, { "cell_type": "code", "execution_count": 10, "metadata": { "collapsed": true }, "outputs": [], "source": [ "model = Sequential()\n", "\n", "# add Dense Layers to model\n", "model.add(Dense(16, input_dim = 2, activation = 'relu'))\n", "model.add(Dense(1, activation = 'sigmoid'))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We can visualize our model like this:" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Tip: As a rule of thumb the model should be big enough to deal with the task but not bigger. If the model is too big it may start finding pattern in your data that are actually irrelevant for the problem at hand. Keeping the model at a reasonable size means it’s forced to look at the relevant pattern." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "There’s one last thing we have to do before we can start training our model. We have to configure the learning process by calling model.compile(...) with a set of parameters." ] }, { "cell_type": "code", "execution_count": 11, "metadata": { "collapsed": true }, "outputs": [], "source": [ "model.compile(loss = 'mean_squared_error', # the objective that the model will try to minimize\n", " optimizer = 'adam', # optimizer finds the right adjustments for the weights\n", " metrics = ['binary_accuracy']) # mectric to judge the performance of the model" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "In order for the neural network to be able to make the right adjustments to the weights we need to be able to tell how good our model is performing. 
Or, to be more specific, with neural nets we always want to calculate a number that tells us how badly our model performs, and then try to get that number lower.

That number is the so-called loss, and we can decide how the loss is calculated. Similar to how we picked relu as our activation function, we picked mean_squared_error as our loss function simply because it’s a well-proven loss function. We could change it to binary_crossentropy and our model would still continue to work. Different loss functions serve specific use cases, but that does not matter for our little XOR problem.

That brings us to the next parameter, the optimizer. The job of the optimizer is to find the right adjustments for the weights. By now you may guess how we picked adam as our optimizer of choice. Right, because it’s a well-proven one!

The third parameter, metrics, is actually much more interesting for our learning efforts. Here we can specify which metrics to collect during the training. We are interested in binary_accuracy, which gives us access to a number that tells us exactly how accurate our predictions are. More on that later.

And that’s all we have to set up before we can start training our model. We kick off the training by calling model.fit(...) with a bunch of parameters.

The first two parameters are the training and target data, the third one is the number of epochs (learning iterations), and the last one tells Keras how much info to print out during the training.

Once the training phase has finished, we can start making predictions with model.predict(...).
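As an aside (my illustration, not from the notebook): for a batch this small, the mean_squared_error loss chosen above is easy to compute by hand. The prediction values here are invented:

```python
import numpy as np

# invented raw predictions vs. the XOR targets, for illustration only
predictions = np.array([[0.2], [0.7], [0.6], [0.4]])
targets = np.array([[0.], [1.], [1.], [0.]])

# mean_squared_error: the average of the squared differences
mse = ((predictions - targets) ** 2).mean()
print(round(mse, 4))  # 0.1125
```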
] }, { "cell_type": "code", "execution_count": 12, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Epoch 1/300\n", "0s - loss: 0.2626 - binary_accuracy: 0.5000\n", "Epoch 2/300\n", "0s - loss: 0.2623 - binary_accuracy: 0.2500\n", "Epoch 3/300\n", "0s - loss: 0.2617 - binary_accuracy: 0.2500\n", "Epoch 4/300\n", "0s - loss: 0.2612 - binary_accuracy: 0.2500\n", "Epoch 5/300\n", "0s - loss: 0.2607 - binary_accuracy: 0.2500\n", "Epoch 6/300\n", "0s - loss: 0.2602 - binary_accuracy: 0.2500\n", "Epoch 7/300\n", "0s - loss: 0.2597 - binary_accuracy: 0.2500\n", "Epoch 8/300\n", "0s - loss: 0.2593 - binary_accuracy: 0.2500\n", "Epoch 9/300\n", "0s - loss: 0.2588 - binary_accuracy: 0.2500\n", "Epoch 10/300\n", "0s - loss: 0.2583 - binary_accuracy: 0.2500\n", "Epoch 11/300\n", "0s - loss: 0.2578 - binary_accuracy: 0.2500\n", "Epoch 12/300\n", "0s - loss: 0.2574 - binary_accuracy: 0.2500\n", "Epoch 13/300\n", "0s - loss: 0.2569 - binary_accuracy: 0.2500\n", "Epoch 14/300\n", "0s - loss: 0.2564 - binary_accuracy: 0.2500\n", "Epoch 15/300\n", "0s - loss: 0.2560 - binary_accuracy: 0.2500\n", "Epoch 16/300\n", "0s - loss: 0.2555 - binary_accuracy: 0.2500\n", "Epoch 17/300\n", "0s - loss: 0.2551 - binary_accuracy: 0.2500\n", "Epoch 18/300\n", "0s - loss: 0.2547 - binary_accuracy: 0.2500\n", "Epoch 19/300\n", "0s - loss: 0.2542 - binary_accuracy: 0.2500\n", "Epoch 20/300\n", "0s - loss: 0.2538 - binary_accuracy: 0.2500\n", "Epoch 21/300\n", "0s - loss: 0.2534 - binary_accuracy: 0.2500\n", "Epoch 22/300\n", "0s - loss: 0.2530 - binary_accuracy: 0.2500\n", "Epoch 23/300\n", "0s - loss: 0.2526 - binary_accuracy: 0.2500\n", "Epoch 24/300\n", "0s - loss: 0.2522 - binary_accuracy: 0.2500\n", "Epoch 25/300\n", "0s - loss: 0.2518 - binary_accuracy: 0.2500\n", "Epoch 26/300\n", "0s - loss: 0.2514 - binary_accuracy: 0.2500\n", "Epoch 27/300\n", "0s - loss: 0.2510 - binary_accuracy: 0.2500\n", "Epoch 28/300\n", "0s - loss: 0.2506 - binary_accuracy: 
0.2500\n", "Epoch 29/300\n", "0s - loss: 0.2503 - binary_accuracy: 0.2500\n", "Epoch 30/300\n", "0s - loss: 0.2499 - binary_accuracy: 0.2500\n", "Epoch 31/300\n", "0s - loss: 0.2495 - binary_accuracy: 0.2500\n", "Epoch 32/300\n", "0s - loss: 0.2492 - binary_accuracy: 0.2500\n", "Epoch 33/300\n", "0s - loss: 0.2488 - binary_accuracy: 0.2500\n", "Epoch 34/300\n", "0s - loss: 0.2485 - binary_accuracy: 0.2500\n", "Epoch 35/300\n", "0s - loss: 0.2482 - binary_accuracy: 0.2500\n", "Epoch 36/300\n", "0s - loss: 0.2478 - binary_accuracy: 0.2500\n", "Epoch 37/300\n", "0s - loss: 0.2475 - binary_accuracy: 0.2500\n", "Epoch 38/300\n", "0s - loss: 0.2472 - binary_accuracy: 0.2500\n", "Epoch 39/300\n", "0s - loss: 0.2469 - binary_accuracy: 0.2500\n", "Epoch 40/300\n", "0s - loss: 0.2466 - binary_accuracy: 0.2500\n", "Epoch 41/300\n", "0s - loss: 0.2463 - binary_accuracy: 0.2500\n", "Epoch 42/300\n", "0s - loss: 0.2460 - binary_accuracy: 0.2500\n", "Epoch 43/300\n", "0s - loss: 0.2457 - binary_accuracy: 0.5000\n", "Epoch 44/300\n", "0s - loss: 0.2454 - binary_accuracy: 0.5000\n", "Epoch 45/300\n", "0s - loss: 0.2451 - binary_accuracy: 0.5000\n", "Epoch 46/300\n", "0s - loss: 0.2448 - binary_accuracy: 0.5000\n", "Epoch 47/300\n", "0s - loss: 0.2445 - binary_accuracy: 0.5000\n", "Epoch 48/300\n", "0s - loss: 0.2442 - binary_accuracy: 0.5000\n", "Epoch 49/300\n", "0s - loss: 0.2440 - binary_accuracy: 0.5000\n", "Epoch 50/300\n", "0s - loss: 0.2437 - binary_accuracy: 0.5000\n", "Epoch 51/300\n", "0s - loss: 0.2435 - binary_accuracy: 0.5000\n", "Epoch 52/300\n", "0s - loss: 0.2432 - binary_accuracy: 0.5000\n", "Epoch 53/300\n", "0s - loss: 0.2430 - binary_accuracy: 0.5000\n", "Epoch 54/300\n", "0s - loss: 0.2427 - binary_accuracy: 0.5000\n", "Epoch 55/300\n", "0s - loss: 0.2425 - binary_accuracy: 0.5000\n", "Epoch 56/300\n", "0s - loss: 0.2422 - binary_accuracy: 0.5000\n", "Epoch 57/300\n", "0s - loss: 0.2419 - binary_accuracy: 0.5000\n", "Epoch 58/300\n", "0s - loss: 0.2417 - 
binary_accuracy: 0.5000\n", "Epoch 59/300\n", "0s - loss: 0.2414 - binary_accuracy: 0.5000\n", "Epoch 60/300\n", "0s - loss: 0.2412 - binary_accuracy: 0.5000\n", "Epoch 61/300\n", "0s - loss: 0.2410 - binary_accuracy: 0.5000\n", "Epoch 62/300\n", "0s - loss: 0.2407 - binary_accuracy: 0.5000\n", "Epoch 63/300\n", "0s - loss: 0.2405 - binary_accuracy: 0.5000\n", "Epoch 64/300\n", "0s - loss: 0.2403 - binary_accuracy: 0.5000\n", "Epoch 65/300\n", "0s - loss: 0.2400 - binary_accuracy: 0.5000\n", "Epoch 66/300\n", "0s - loss: 0.2398 - binary_accuracy: 0.5000\n", "Epoch 67/300\n", "0s - loss: 0.2396 - binary_accuracy: 0.5000\n", "Epoch 68/300\n", "0s - loss: 0.2394 - binary_accuracy: 0.5000\n", "Epoch 69/300\n", "0s - loss: 0.2391 - binary_accuracy: 0.5000\n", "Epoch 70/300\n", "0s - loss: 0.2389 - binary_accuracy: 0.5000\n", "Epoch 71/300\n", "0s - loss: 0.2387 - binary_accuracy: 0.5000\n", "Epoch 72/300\n", "0s - loss: 0.2384 - binary_accuracy: 0.5000\n", "Epoch 73/300\n", "0s - loss: 0.2382 - binary_accuracy: 0.5000\n", "Epoch 74/300\n", "0s - loss: 0.2380 - binary_accuracy: 0.5000\n", "Epoch 75/300\n", "0s - loss: 0.2378 - binary_accuracy: 0.5000\n", "Epoch 76/300\n", "0s - loss: 0.2375 - binary_accuracy: 0.5000\n", "Epoch 77/300\n", "0s - loss: 0.2373 - binary_accuracy: 0.5000\n", "Epoch 78/300\n", "0s - loss: 0.2371 - binary_accuracy: 0.5000\n", "Epoch 79/300\n", "0s - loss: 0.2369 - binary_accuracy: 0.5000\n", "Epoch 80/300\n", "0s - loss: 0.2367 - binary_accuracy: 0.5000\n", "Epoch 81/300\n", "0s - loss: 0.2364 - binary_accuracy: 0.5000\n", "Epoch 82/300\n", "0s - loss: 0.2362 - binary_accuracy: 0.5000\n", "Epoch 83/300\n", "0s - loss: 0.2360 - binary_accuracy: 0.5000\n", "Epoch 84/300\n", "0s - loss: 0.2358 - binary_accuracy: 0.5000\n", "Epoch 85/300\n", "0s - loss: 0.2355 - binary_accuracy: 0.5000\n", "Epoch 86/300\n", "0s - loss: 0.2353 - binary_accuracy: 0.5000\n", "Epoch 87/300\n", "0s - loss: 0.2351 - binary_accuracy: 0.5000\n", "Epoch 88/300\n", "0s - 
loss: 0.2349 - binary_accuracy: 0.5000\n", "Epoch 89/300\n", "0s - loss: 0.2346 - binary_accuracy: 0.7500\n", "Epoch 90/300\n", "0s - loss: 0.2344 - binary_accuracy: 0.7500\n", "Epoch 91/300\n", "0s - loss: 0.2342 - binary_accuracy: 0.7500\n", "Epoch 92/300\n", "0s - loss: 0.2340 - binary_accuracy: 0.7500\n", "Epoch 93/300\n", "0s - loss: 0.2337 - binary_accuracy: 0.7500\n", "Epoch 94/300\n", "0s - loss: 0.2335 - binary_accuracy: 0.7500\n", "Epoch 95/300\n", "0s - loss: 0.2333 - binary_accuracy: 0.7500\n", "Epoch 96/300\n", "0s - loss: 0.2331 - binary_accuracy: 0.7500\n", "Epoch 97/300\n", "0s - loss: 0.2328 - binary_accuracy: 0.7500\n", "Epoch 98/300\n", "0s - loss: 0.2326 - binary_accuracy: 0.7500\n", "Epoch 99/300\n", "0s - loss: 0.2324 - binary_accuracy: 0.7500\n", "Epoch 100/300\n", "0s - loss: 0.2321 - binary_accuracy: 0.7500\n", "Epoch 101/300\n", "0s - loss: 0.2319 - binary_accuracy: 0.7500\n", "Epoch 102/300\n", "0s - loss: 0.2317 - binary_accuracy: 0.7500\n", "Epoch 103/300\n", "0s - loss: 0.2315 - binary_accuracy: 0.7500\n", "Epoch 104/300\n", "0s - loss: 0.2312 - binary_accuracy: 0.7500\n", "Epoch 105/300\n", "0s - loss: 0.2310 - binary_accuracy: 0.7500\n", "Epoch 106/300\n", "0s - loss: 0.2308 - binary_accuracy: 0.7500\n", "Epoch 107/300\n", "0s - loss: 0.2305 - binary_accuracy: 0.7500\n", "Epoch 108/300\n", "0s - loss: 0.2303 - binary_accuracy: 0.7500\n", "Epoch 109/300\n", "0s - loss: 0.2301 - binary_accuracy: 0.7500\n", "Epoch 110/300\n", "0s - loss: 0.2298 - binary_accuracy: 0.7500\n", "Epoch 111/300\n", "0s - loss: 0.2296 - binary_accuracy: 0.7500\n", "Epoch 112/300\n", "0s - loss: 0.2294 - binary_accuracy: 0.7500\n", "Epoch 113/300\n", "0s - loss: 0.2291 - binary_accuracy: 0.7500\n", "Epoch 114/300\n", "0s - loss: 0.2289 - binary_accuracy: 0.7500\n", "Epoch 115/300\n", "0s - loss: 0.2286 - binary_accuracy: 0.7500\n", "Epoch 116/300\n", "0s - loss: 0.2284 - binary_accuracy: 0.7500\n", "Epoch 117/300\n", "0s - loss: 0.2282 - binary_accuracy: 
0.7500\n", "Epoch 118/300\n", "0s - loss: 0.2279 - binary_accuracy: 0.7500\n", "Epoch 119/300\n", "0s - loss: 0.2277 - binary_accuracy: 0.7500\n", "Epoch 120/300\n", "0s - loss: 0.2274 - binary_accuracy: 0.7500\n", "Epoch 121/300\n", "0s - loss: 0.2272 - binary_accuracy: 0.7500\n", "Epoch 122/300\n", "0s - loss: 0.2270 - binary_accuracy: 0.7500\n", "Epoch 123/300\n", "0s - loss: 0.2267 - binary_accuracy: 0.7500\n", "Epoch 124/300\n", "0s - loss: 0.2265 - binary_accuracy: 0.7500\n", "Epoch 125/300\n", "0s - loss: 0.2262 - binary_accuracy: 0.7500\n", "Epoch 126/300\n", "0s - loss: 0.2260 - binary_accuracy: 0.7500\n", "Epoch 127/300\n", "0s - loss: 0.2257 - binary_accuracy: 0.7500\n", "Epoch 128/300\n", "0s - loss: 0.2255 - binary_accuracy: 0.7500\n", "Epoch 129/300\n", "0s - loss: 0.2252 - binary_accuracy: 0.7500\n", "Epoch 130/300\n", "0s - loss: 0.2250 - binary_accuracy: 0.7500\n", "Epoch 131/300\n", "0s - loss: 0.2247 - binary_accuracy: 0.7500\n", "Epoch 132/300\n", "0s - loss: 0.2245 - binary_accuracy: 0.7500\n", "Epoch 133/300\n", "0s - loss: 0.2242 - binary_accuracy: 0.7500\n", "Epoch 134/300\n", "0s - loss: 0.2240 - binary_accuracy: 0.7500\n", "Epoch 135/300\n", "0s - loss: 0.2237 - binary_accuracy: 0.7500\n", "Epoch 136/300\n", "0s - loss: 0.2235 - binary_accuracy: 0.7500\n", "Epoch 137/300\n", "0s - loss: 0.2232 - binary_accuracy: 0.7500\n", "Epoch 138/300\n", "0s - loss: 0.2230 - binary_accuracy: 0.7500\n", "Epoch 139/300\n", "0s - loss: 0.2227 - binary_accuracy: 0.7500\n", "Epoch 140/300\n", "0s - loss: 0.2225 - binary_accuracy: 0.7500\n", "Epoch 141/300\n", "0s - loss: 0.2222 - binary_accuracy: 0.7500\n", "Epoch 142/300\n", "0s - loss: 0.2219 - binary_accuracy: 0.7500\n", "Epoch 143/300\n", "0s - loss: 0.2217 - binary_accuracy: 0.7500\n", "Epoch 144/300\n", "0s - loss: 0.2214 - binary_accuracy: 0.7500\n", "Epoch 145/300\n", "0s - loss: 0.2212 - binary_accuracy: 0.7500\n", "Epoch 146/300\n", "0s - loss: 0.2209 - binary_accuracy: 0.7500\n", "Epoch 
147/300\n", "0s - loss: 0.2206 - binary_accuracy: 0.7500\n", "Epoch 148/300\n", "0s - loss: 0.2204 - binary_accuracy: 0.7500\n", "Epoch 149/300\n", "0s - loss: 0.2201 - binary_accuracy: 0.7500\n", "Epoch 150/300\n", "0s - loss: 0.2199 - binary_accuracy: 0.7500\n", "Epoch 151/300\n", "0s - loss: 0.2196 - binary_accuracy: 0.7500\n", "Epoch 152/300\n", "0s - loss: 0.2193 - binary_accuracy: 0.7500\n", "Epoch 153/300\n", "0s - loss: 0.2191 - binary_accuracy: 0.7500\n", "Epoch 154/300\n", "0s - loss: 0.2188 - binary_accuracy: 0.7500\n", "Epoch 155/300\n", "0s - loss: 0.2185 - binary_accuracy: 0.7500\n", "Epoch 156/300\n", "0s - loss: 0.2182 - binary_accuracy: 0.7500\n", "Epoch 157/300\n", "0s - loss: 0.2180 - binary_accuracy: 0.7500\n", "Epoch 158/300\n", "0s - loss: 0.2177 - binary_accuracy: 0.7500\n", "Epoch 159/300\n", "0s - loss: 0.2174 - binary_accuracy: 0.7500\n", "Epoch 160/300\n", "0s - loss: 0.2172 - binary_accuracy: 0.7500\n", "Epoch 161/300\n", "0s - loss: 0.2169 - binary_accuracy: 0.7500\n", "Epoch 162/300\n", "0s - loss: 0.2166 - binary_accuracy: 0.7500\n", "Epoch 163/300\n", "0s - loss: 0.2163 - binary_accuracy: 0.7500\n", "Epoch 164/300\n", "0s - loss: 0.2161 - binary_accuracy: 0.7500\n", "Epoch 165/300\n", "0s - loss: 0.2158 - binary_accuracy: 0.7500\n", "Epoch 166/300\n", "0s - loss: 0.2155 - binary_accuracy: 0.7500\n", "Epoch 167/300\n", "0s - loss: 0.2152 - binary_accuracy: 0.7500\n", "Epoch 168/300\n", "0s - loss: 0.2150 - binary_accuracy: 0.7500\n", "Epoch 169/300\n", "0s - loss: 0.2147 - binary_accuracy: 0.7500\n", "Epoch 170/300\n", "0s - loss: 0.2144 - binary_accuracy: 0.7500\n", "Epoch 171/300\n", "0s - loss: 0.2141 - binary_accuracy: 0.7500\n", "Epoch 172/300\n", "0s - loss: 0.2138 - binary_accuracy: 0.7500\n", "Epoch 173/300\n", "0s - loss: 0.2135 - binary_accuracy: 0.7500\n", "Epoch 174/300\n", "0s - loss: 0.2133 - binary_accuracy: 0.7500\n", "Epoch 175/300\n", "0s - loss: 0.2130 - binary_accuracy: 0.7500\n", "Epoch 176/300\n", "0s - loss: 
0.2127 - binary_accuracy: 0.7500\n", "Epoch 177/300\n", "0s - loss: 0.2124 - binary_accuracy: 0.7500\n", "Epoch 178/300\n", "0s - loss: 0.2121 - binary_accuracy: 0.7500\n", "Epoch 179/300\n", "0s - loss: 0.2118 - binary_accuracy: 0.7500\n", "Epoch 180/300\n", "0s - loss: 0.2115 - binary_accuracy: 0.7500\n", "Epoch 181/300\n", "0s - loss: 0.2112 - binary_accuracy: 0.7500\n", "Epoch 182/300\n", "0s - loss: 0.2109 - binary_accuracy: 0.7500\n", "Epoch 183/300\n", "0s - loss: 0.2106 - binary_accuracy: 0.7500\n", "Epoch 184/300\n", "0s - loss: 0.2103 - binary_accuracy: 0.7500\n", "Epoch 185/300\n", "0s - loss: 0.2100 - binary_accuracy: 0.7500\n", "Epoch 186/300\n", "0s - loss: 0.2097 - binary_accuracy: 0.7500\n", "Epoch 187/300\n", "0s - loss: 0.2094 - binary_accuracy: 0.7500\n", "Epoch 188/300\n", "0s - loss: 0.2091 - binary_accuracy: 0.7500\n", "Epoch 189/300\n", "0s - loss: 0.2088 - binary_accuracy: 0.7500\n", "Epoch 190/300\n", "0s - loss: 0.2085 - binary_accuracy: 0.7500\n", "Epoch 191/300\n", "0s - loss: 0.2082 - binary_accuracy: 0.7500\n", "Epoch 192/300\n", "0s - loss: 0.2079 - binary_accuracy: 0.7500\n", "Epoch 193/300\n", "0s - loss: 0.2076 - binary_accuracy: 0.7500\n", "Epoch 194/300\n", "0s - loss: 0.2073 - binary_accuracy: 0.7500\n", "Epoch 195/300\n", "0s - loss: 0.2070 - binary_accuracy: 0.7500\n", "Epoch 196/300\n", "0s - loss: 0.2067 - binary_accuracy: 0.7500\n", "Epoch 197/300\n", "0s - loss: 0.2064 - binary_accuracy: 0.7500\n", "Epoch 198/300\n", "0s - loss: 0.2061 - binary_accuracy: 0.7500\n", "Epoch 199/300\n", "0s - loss: 0.2058 - binary_accuracy: 0.7500\n", "Epoch 200/300\n", "0s - loss: 0.2055 - binary_accuracy: 0.7500\n", "Epoch 201/300\n", "0s - loss: 0.2051 - binary_accuracy: 0.7500\n", "Epoch 202/300\n", "0s - loss: 0.2048 - binary_accuracy: 0.7500\n", "Epoch 203/300\n", "0s - loss: 0.2045 - binary_accuracy: 0.7500\n", "Epoch 204/300\n", "0s - loss: 0.2042 - binary_accuracy: 0.7500\n", "Epoch 205/300\n", "0s - loss: 0.2039 - binary_accuracy: 
0.7500\n", "Epoch 206/300\n", "0s - loss: 0.2036 - binary_accuracy: 0.7500\n", "Epoch 207/300\n", "0s - loss: 0.2032 - binary_accuracy: 0.7500\n", "Epoch 208/300\n", "0s - loss: 0.2029 - binary_accuracy: 0.7500\n", "Epoch 209/300\n", "0s - loss: 0.2026 - binary_accuracy: 0.7500\n", "Epoch 210/300\n", "0s - loss: 0.2023 - binary_accuracy: 0.7500\n", "Epoch 211/300\n", "0s - loss: 0.2020 - binary_accuracy: 0.7500\n", "Epoch 212/300\n", "0s - loss: 0.2016 - binary_accuracy: 0.7500\n", "Epoch 213/300\n", "0s - loss: 0.2013 - binary_accuracy: 0.7500\n", "Epoch 214/300\n", "0s - loss: 0.2010 - binary_accuracy: 0.7500\n", "Epoch 215/300\n", "0s - loss: 0.2007 - binary_accuracy: 0.7500\n", "Epoch 216/300\n", "0s - loss: 0.2003 - binary_accuracy: 0.7500\n", "Epoch 217/300\n", "0s - loss: 0.2000 - binary_accuracy: 0.7500\n", "Epoch 218/300\n", "0s - loss: 0.1997 - binary_accuracy: 0.7500\n", "Epoch 219/300\n", "0s - loss: 0.1994 - binary_accuracy: 0.7500\n", "Epoch 220/300\n", "0s - loss: 0.1990 - binary_accuracy: 0.7500\n", "Epoch 221/300\n", "0s - loss: 0.1987 - binary_accuracy: 0.7500\n", "Epoch 222/300\n", "0s - loss: 0.1984 - binary_accuracy: 0.7500\n", "Epoch 223/300\n", "0s - loss: 0.1980 - binary_accuracy: 0.7500\n", "Epoch 224/300\n", "0s - loss: 0.1977 - binary_accuracy: 0.7500\n", "Epoch 225/300\n", "0s - loss: 0.1974 - binary_accuracy: 0.7500\n", "Epoch 226/300\n", "0s - loss: 0.1970 - binary_accuracy: 0.7500\n", "Epoch 227/300\n", "0s - loss: 0.1967 - binary_accuracy: 0.7500\n", "Epoch 228/300\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "0s - loss: 0.1964 - binary_accuracy: 0.7500\n", "Epoch 229/300\n", "0s - loss: 0.1961 - binary_accuracy: 0.7500\n", "Epoch 230/300\n", "0s - loss: 0.1957 - binary_accuracy: 0.7500\n", "Epoch 231/300\n", "0s - loss: 0.1954 - binary_accuracy: 0.7500\n", "Epoch 232/300\n", "0s - loss: 0.1950 - binary_accuracy: 0.7500\n", "Epoch 233/300\n", "0s - loss: 0.1947 - binary_accuracy: 0.7500\n", "Epoch 234/300\n", "0s - 
loss: 0.1944 - binary_accuracy: 0.7500\n", "Epoch 235/300\n", "0s - loss: 0.1940 - binary_accuracy: 0.7500\n", "Epoch 236/300\n", "0s - loss: 0.1937 - binary_accuracy: 0.7500\n", "Epoch 237/300\n", "0s - loss: 0.1933 - binary_accuracy: 0.7500\n", "Epoch 238/300\n", "0s - loss: 0.1930 - binary_accuracy: 0.7500\n", "Epoch 239/300\n", "0s - loss: 0.1926 - binary_accuracy: 0.7500\n", "Epoch 240/300\n", "0s - loss: 0.1923 - binary_accuracy: 0.7500\n", "Epoch 241/300\n", "0s - loss: 0.1919 - binary_accuracy: 0.7500\n", "Epoch 242/300\n", "0s - loss: 0.1916 - binary_accuracy: 0.7500\n", "Epoch 243/300\n", "0s - loss: 0.1912 - binary_accuracy: 0.7500\n", "Epoch 244/300\n", "0s - loss: 0.1909 - binary_accuracy: 0.7500\n", "Epoch 245/300\n", "0s - loss: 0.1906 - binary_accuracy: 0.7500\n", "Epoch 246/300\n", "0s - loss: 0.1902 - binary_accuracy: 0.7500\n", "Epoch 247/300\n", "0s - loss: 0.1898 - binary_accuracy: 0.7500\n", "Epoch 248/300\n", "0s - loss: 0.1895 - binary_accuracy: 0.7500\n", "Epoch 249/300\n", "0s - loss: 0.1892 - binary_accuracy: 0.7500\n", "Epoch 250/300\n", "0s - loss: 0.1888 - binary_accuracy: 0.7500\n", "Epoch 251/300\n", "0s - loss: 0.1884 - binary_accuracy: 0.7500\n", "Epoch 252/300\n", "0s - loss: 0.1881 - binary_accuracy: 0.7500\n", "Epoch 253/300\n", "0s - loss: 0.1877 - binary_accuracy: 0.7500\n", "Epoch 254/300\n", "0s - loss: 0.1874 - binary_accuracy: 0.7500\n", "Epoch 255/300\n", "0s - loss: 0.1870 - binary_accuracy: 0.7500\n", "Epoch 256/300\n", "0s - loss: 0.1867 - binary_accuracy: 0.7500\n", "Epoch 257/300\n", "0s - loss: 0.1863 - binary_accuracy: 0.7500\n", "Epoch 258/300\n", "0s - loss: 0.1860 - binary_accuracy: 1.0000\n", "Epoch 259/300\n", "0s - loss: 0.1856 - binary_accuracy: 1.0000\n", "Epoch 260/300\n", "0s - loss: 0.1852 - binary_accuracy: 1.0000\n", "Epoch 261/300\n", "0s - loss: 0.1849 - binary_accuracy: 1.0000\n", "Epoch 262/300\n", "0s - loss: 0.1845 - binary_accuracy: 1.0000\n", "Epoch 263/300\n", "0s - loss: 0.1841 - 
binary_accuracy: 1.0000\n", "Epoch 264/300\n", "0s - loss: 0.1838 - binary_accuracy: 1.0000\n", "Epoch 265/300\n", "0s - loss: 0.1834 - binary_accuracy: 1.0000\n", "Epoch 266/300\n", "0s - loss: 0.1831 - binary_accuracy: 1.0000\n", "Epoch 267/300\n", "0s - loss: 0.1827 - binary_accuracy: 1.0000\n", "Epoch 268/300\n", "0s - loss: 0.1824 - binary_accuracy: 1.0000\n", "Epoch 269/300\n", "0s - loss: 0.1820 - binary_accuracy: 1.0000\n", "Epoch 270/300\n", "0s - loss: 0.1816 - binary_accuracy: 1.0000\n", "Epoch 271/300\n", "0s - loss: 0.1813 - binary_accuracy: 1.0000\n", "Epoch 272/300\n", "0s - loss: 0.1809 - binary_accuracy: 1.0000\n", "Epoch 273/300\n", "0s - loss: 0.1806 - binary_accuracy: 1.0000\n", "Epoch 274/300\n", "0s - loss: 0.1802 - binary_accuracy: 1.0000\n", "Epoch 275/300\n", "0s - loss: 0.1798 - binary_accuracy: 1.0000\n", "Epoch 276/300\n", "0s - loss: 0.1795 - binary_accuracy: 1.0000\n", "Epoch 277/300\n", "0s - loss: 0.1791 - binary_accuracy: 1.0000\n", "Epoch 278/300\n", "0s - loss: 0.1787 - binary_accuracy: 1.0000\n", "Epoch 279/300\n", "0s - loss: 0.1784 - binary_accuracy: 1.0000\n", "Epoch 280/300\n", "0s - loss: 0.1780 - binary_accuracy: 1.0000\n", "Epoch 281/300\n", "0s - loss: 0.1776 - binary_accuracy: 1.0000\n", "Epoch 282/300\n", "0s - loss: 0.1773 - binary_accuracy: 1.0000\n", "Epoch 283/300\n", "0s - loss: 0.1769 - binary_accuracy: 1.0000\n", "Epoch 284/300\n", "0s - loss: 0.1765 - binary_accuracy: 1.0000\n", "Epoch 285/300\n", "0s - loss: 0.1761 - binary_accuracy: 1.0000\n", "Epoch 286/300\n", "0s - loss: 0.1758 - binary_accuracy: 1.0000\n", "Epoch 287/300\n", "0s - loss: 0.1754 - binary_accuracy: 1.0000\n", "Epoch 288/300\n", "0s - loss: 0.1750 - binary_accuracy: 1.0000\n", "Epoch 289/300\n", "0s - loss: 0.1747 - binary_accuracy: 1.0000\n", "Epoch 290/300\n", "0s - loss: 0.1743 - binary_accuracy: 1.0000\n", "Epoch 291/300\n", "0s - loss: 0.1739 - binary_accuracy: 1.0000\n", "Epoch 292/300\n", "0s - loss: 0.1735 - binary_accuracy: 1.0000\n", 
"Epoch 293/300\n", "0s - loss: 0.1732 - binary_accuracy: 1.0000\n", "Epoch 294/300\n", "0s - loss: 0.1728 - binary_accuracy: 1.0000\n", "Epoch 295/300\n", "0s - loss: 0.1725 - binary_accuracy: 1.0000\n", "Epoch 296/300\n", "0s - loss: 0.1721 - binary_accuracy: 1.0000\n", "Epoch 297/300\n", "0s - loss: 0.1717 - binary_accuracy: 1.0000\n", "Epoch 298/300\n", "0s - loss: 0.1714 - binary_accuracy: 1.0000\n", "Epoch 299/300\n", "0s - loss: 0.1710 - binary_accuracy: 1.0000\n", "Epoch 300/300\n", "0s - loss: 0.1706 - binary_accuracy: 1.0000\n", "\n", "Input after training:\n", " [[ 0. 0.]\n", " [ 0. 1.]\n", " [ 1. 0.]\n", " [ 1. 1.]]\n", "\n", "Prediction:\n", " [[ 0.]\n", " [ 1.]\n", " [ 1.]\n", " [ 0.]]\n" ] } ], "source": [ "model.fit(training_data, target_data, epochs=300, verbose=2)\n", "\n", "print(\"\\nInput after training:\\n\", training_data)\n", "print(\"\\nPrediction:\\n\", model.predict(training_data).round())" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Please note that in a real world scenario our predictions would be tested against data that the neural network hasn’t seen during the training. That’s because we usually want to see if our model generalizes well. In other words, does it work with new data or does it just memorize all the data and expected results it had seen in the training phase? However, with this toy task there are really only our four states and four expected outputs. No way to proof generalization here.\n", "\n", "Also note that we are rounding the output to get clear binary answers. Neural networks calculate probabilities. If we wouldn’t round we would see something like 0.9993... and 0.00034... instead of 1 and 0 which isn’t exactly what we want." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ " " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Have a look at the output of our neural net.\n", "Since we’ve set verbose=2 and metrics=['binary_accuracy'] earlier we get all these nice infos after each epoch. The interesting number we want to focus on is binary_accuracy. Guess what the 0.5000 at the first epochs mean? If you’re thinking it means that our model predicts one out of our four states correctly you’re damn right. It took us many epochs to predict half of the four states correctly. After even more epochs the model makes perfect predictions for all of our four XOR states." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now we can start making changes to our model and see how it affects the performance. Let’s try to increase the size of our hidden layer from 16 to 32." ] }, { "cell_type": "code", "execution_count": 14, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Epoch 1/300\n", "0s - loss: 0.2676 - binary_accuracy: 0.5000\n", "Epoch 2/300\n", "0s - loss: 0.2663 - binary_accuracy: 0.2500\n", "Epoch 3/300\n", "0s - loss: 0.2650 - binary_accuracy: 0.2500\n", "Epoch 4/300\n", "0s - loss: 0.2638 - binary_accuracy: 0.2500\n", "Epoch 5/300\n", "0s - loss: 0.2625 - binary_accuracy: 0.2500\n", "Epoch 6/300\n", "0s - loss: 0.2613 - binary_accuracy: 0.2500\n", "Epoch 7/300\n", "0s - loss: 0.2600 - binary_accuracy: 0.2500\n", "Epoch 8/300\n", "0s - loss: 0.2588 - binary_accuracy: 0.2500\n", "Epoch 9/300\n", "0s - loss: 0.2576 - binary_accuracy: 0.2500\n", "Epoch 10/300\n", "0s - loss: 0.2564 - binary_accuracy: 0.2500\n", "Epoch 11/300\n", "0s - loss: 0.2553 - binary_accuracy: 0.2500\n", "Epoch 12/300\n", "0s - loss: 0.2542 - binary_accuracy: 0.2500\n", "Epoch 13/300\n", "0s - loss: 0.2531 - binary_accuracy: 0.2500\n", "Epoch 14/300\n", "0s - loss: 0.2520 - binary_accuracy: 0.2500\n", "Epoch 15/300\n", "0s - loss: 0.2509 - binary_accuracy: 
0.2500\n", "Epoch 16/300\n", "0s - loss: 0.2499 - binary_accuracy: 0.2500\n", "Epoch 17/300\n", "0s - loss: 0.2489 - binary_accuracy: 0.2500\n", "Epoch 18/300\n", "0s - loss: 0.2479 - binary_accuracy: 0.2500\n", "Epoch 19/300\n", "0s - loss: 0.2469 - binary_accuracy: 0.2500\n", "Epoch 20/300\n", "0s - loss: 0.2459 - binary_accuracy: 0.2500\n", "Epoch 21/300\n", "0s - loss: 0.2449 - binary_accuracy: 0.2500\n", "Epoch 22/300\n", "0s - loss: 0.2440 - binary_accuracy: 0.2500\n", "Epoch 23/300\n", "0s - loss: 0.2430 - binary_accuracy: 0.2500\n", "Epoch 24/300\n", "0s - loss: 0.2421 - binary_accuracy: 0.2500\n", "Epoch 25/300\n", "0s - loss: 0.2412 - binary_accuracy: 0.2500\n", "Epoch 26/300\n", "0s - loss: 0.2404 - binary_accuracy: 0.2500\n", "Epoch 27/300\n", "0s - loss: 0.2395 - binary_accuracy: 0.5000\n", "Epoch 28/300\n", "0s - loss: 0.2387 - binary_accuracy: 0.5000\n", "Epoch 29/300\n", "0s - loss: 0.2378 - binary_accuracy: 0.5000\n", "Epoch 30/300\n", "0s - loss: 0.2370 - binary_accuracy: 0.5000\n", "Epoch 31/300\n", "0s - loss: 0.2362 - binary_accuracy: 0.5000\n", "Epoch 32/300\n", "0s - loss: 0.2355 - binary_accuracy: 0.5000\n", "Epoch 33/300\n", "0s - loss: 0.2348 - binary_accuracy: 0.5000\n", "Epoch 34/300\n", "0s - loss: 0.2341 - binary_accuracy: 0.5000\n", "Epoch 35/300\n", "0s - loss: 0.2334 - binary_accuracy: 0.5000\n", "Epoch 36/300\n", "0s - loss: 0.2327 - binary_accuracy: 0.5000\n", "Epoch 37/300\n", "0s - loss: 0.2320 - binary_accuracy: 0.5000\n", "Epoch 38/300\n", "0s - loss: 0.2314 - binary_accuracy: 0.5000\n", "Epoch 39/300\n", "0s - loss: 0.2308 - binary_accuracy: 0.5000\n", "Epoch 40/300\n", "0s - loss: 0.2302 - binary_accuracy: 0.5000\n", "Epoch 41/300\n", "0s - loss: 0.2296 - binary_accuracy: 0.5000\n", "Epoch 42/300\n", "0s - loss: 0.2290 - binary_accuracy: 0.5000\n", "Epoch 43/300\n", "0s - loss: 0.2284 - binary_accuracy: 0.5000\n", "Epoch 44/300\n", "0s - loss: 0.2279 - binary_accuracy: 0.5000\n", "Epoch 45/300\n", "0s - loss: 0.2273 - 
binary_accuracy: 0.5000\n", "Epoch 46/300\n", "0s - loss: 0.2267 - binary_accuracy: 0.5000\n", "Epoch 47/300\n", "0s - loss: 0.2262 - binary_accuracy: 0.5000\n", "Epoch 48/300\n", "0s - loss: 0.2256 - binary_accuracy: 0.5000\n", "Epoch 49/300\n", "0s - loss: 0.2251 - binary_accuracy: 0.5000\n", "Epoch 50/300\n", "0s - loss: 0.2245 - binary_accuracy: 0.5000\n", "Epoch 51/300\n", "0s - loss: 0.2240 - binary_accuracy: 0.5000\n", "Epoch 52/300\n", "0s - loss: 0.2235 - binary_accuracy: 0.5000\n", "Epoch 53/300\n", "0s - loss: 0.2230 - binary_accuracy: 0.7500\n", "Epoch 54/300\n", "0s - loss: 0.2225 - binary_accuracy: 0.7500\n", "Epoch 55/300\n", "0s - loss: 0.2221 - binary_accuracy: 0.7500\n", "Epoch 56/300\n", "0s - loss: 0.2216 - binary_accuracy: 0.7500\n", "Epoch 57/300\n", "0s - loss: 0.2211 - binary_accuracy: 0.7500\n", "Epoch 58/300\n", "0s - loss: 0.2206 - binary_accuracy: 0.7500\n", "Epoch 59/300\n", "0s - loss: 0.2201 - binary_accuracy: 0.7500\n", "Epoch 60/300\n", "0s - loss: 0.2197 - binary_accuracy: 0.7500\n", "Epoch 61/300\n", "0s - loss: 0.2192 - binary_accuracy: 0.7500\n", "Epoch 62/300\n", "0s - loss: 0.2187 - binary_accuracy: 0.7500\n", "Epoch 63/300\n", "0s - loss: 0.2183 - binary_accuracy: 0.7500\n", "Epoch 64/300\n", "0s - loss: 0.2178 - binary_accuracy: 0.7500\n", "Epoch 65/300\n", "0s - loss: 0.2173 - binary_accuracy: 0.7500\n", "Epoch 66/300\n", "0s - loss: 0.2169 - binary_accuracy: 0.7500\n", "Epoch 67/300\n", "0s - loss: 0.2164 - binary_accuracy: 0.7500\n", "Epoch 68/300\n", "0s - loss: 0.2160 - binary_accuracy: 0.7500\n", "Epoch 69/300\n", "0s - loss: 0.2155 - binary_accuracy: 0.7500\n", "Epoch 70/300\n", "0s - loss: 0.2151 - binary_accuracy: 0.7500\n", "Epoch 71/300\n", "0s - loss: 0.2147 - binary_accuracy: 0.7500\n", "Epoch 72/300\n", "0s - loss: 0.2142 - binary_accuracy: 0.7500\n", "Epoch 73/300\n", "0s - loss: 0.2138 - binary_accuracy: 0.7500\n", "Epoch 74/300\n", "0s - loss: 0.2133 - binary_accuracy: 0.7500\n", "Epoch 75/300\n", "0s - 
loss: 0.2129 - binary_accuracy: 0.7500\n", "Epoch 76/300\n", "0s - loss: 0.2124 - binary_accuracy: 0.7500\n", "Epoch 77/300\n", "0s - loss: 0.2120 - binary_accuracy: 0.7500\n", "Epoch 78/300\n", "0s - loss: 0.2115 - binary_accuracy: 0.7500\n", "Epoch 79/300\n", "0s - loss: 0.2111 - binary_accuracy: 0.7500\n", "Epoch 80/300\n", "0s - loss: 0.2106 - binary_accuracy: 0.7500\n", "Epoch 81/300\n", "0s - loss: 0.2102 - binary_accuracy: 0.7500\n", "Epoch 82/300\n", "0s - loss: 0.2098 - binary_accuracy: 0.7500\n", "Epoch 83/300\n", "0s - loss: 0.2094 - binary_accuracy: 0.7500\n", "Epoch 84/300\n", "0s - loss: 0.2089 - binary_accuracy: 0.7500\n", "Epoch 85/300\n", "0s - loss: 0.2085 - binary_accuracy: 0.7500\n", "Epoch 86/300\n", "0s - loss: 0.2080 - binary_accuracy: 0.7500\n", "Epoch 87/300\n", "0s - loss: 0.2076 - binary_accuracy: 0.7500\n", "Epoch 88/300\n", "0s - loss: 0.2071 - binary_accuracy: 0.7500\n", "Epoch 89/300\n", "0s - loss: 0.2067 - binary_accuracy: 0.7500\n", "Epoch 90/300\n", "0s - loss: 0.2063 - binary_accuracy: 0.7500\n", "Epoch 91/300\n", "0s - loss: 0.2059 - binary_accuracy: 0.7500\n", "Epoch 92/300\n", "0s - loss: 0.2055 - binary_accuracy: 0.7500\n", "Epoch 93/300\n", "0s - loss: 0.2051 - binary_accuracy: 0.7500\n", "Epoch 94/300\n", "0s - loss: 0.2046 - binary_accuracy: 0.7500\n", "Epoch 95/300\n", "0s - loss: 0.2042 - binary_accuracy: 0.7500\n", "Epoch 96/300\n", "0s - loss: 0.2038 - binary_accuracy: 0.7500\n", "Epoch 97/300\n", "0s - loss: 0.2034 - binary_accuracy: 0.7500\n", "Epoch 98/300\n", "0s - loss: 0.2030 - binary_accuracy: 0.7500\n", "Epoch 99/300\n", "0s - loss: 0.2026 - binary_accuracy: 0.7500\n", "Epoch 100/300\n", "0s - loss: 0.2021 - binary_accuracy: 0.7500\n", "Epoch 101/300\n", "0s - loss: 0.2017 - binary_accuracy: 0.7500\n", "Epoch 102/300\n", "0s - loss: 0.2013 - binary_accuracy: 0.7500\n", "Epoch 103/300\n", "0s - loss: 0.2009 - binary_accuracy: 0.7500\n", "Epoch 104/300\n", "0s - loss: 0.2005 - binary_accuracy: 0.7500\n", "Epoch 
105/300\n", "0s - loss: 0.2000 - binary_accuracy: 0.7500\n", "Epoch 106/300\n", "0s - loss: 0.1996 - binary_accuracy: 0.7500\n", "Epoch 107/300\n", "0s - loss: 0.1992 - binary_accuracy: 0.7500\n", "Epoch 108/300\n", "0s - loss: 0.1988 - binary_accuracy: 0.7500\n", "Epoch 109/300\n", "0s - loss: 0.1984 - binary_accuracy: 0.7500\n", "Epoch 110/300\n", "0s - loss: 0.1980 - binary_accuracy: 0.7500\n", "Epoch 111/300\n", "0s - loss: 0.1976 - binary_accuracy: 0.7500\n", "Epoch 112/300\n", "0s - loss: 0.1971 - binary_accuracy: 0.7500\n", "Epoch 113/300\n", "0s - loss: 0.1967 - binary_accuracy: 0.7500\n", "Epoch 114/300\n", "0s - loss: 0.1963 - binary_accuracy: 0.7500\n", "Epoch 115/300\n", "0s - loss: 0.1959 - binary_accuracy: 0.7500\n", "Epoch 116/300\n", "0s - loss: 0.1955 - binary_accuracy: 0.7500\n", "Epoch 117/300\n", "0s - loss: 0.1950 - binary_accuracy: 0.7500\n", "Epoch 118/300\n", "0s - loss: 0.1946 - binary_accuracy: 0.7500\n", "Epoch 119/300\n", "0s - loss: 0.1942 - binary_accuracy: 0.7500\n", "Epoch 120/300\n", "0s - loss: 0.1938 - binary_accuracy: 0.7500\n", "Epoch 121/300\n", "0s - loss: 0.1933 - binary_accuracy: 0.7500\n", "Epoch 122/300\n", "0s - loss: 0.1929 - binary_accuracy: 0.7500\n", "Epoch 123/300\n", "0s - loss: 0.1925 - binary_accuracy: 0.7500\n", "Epoch 124/300\n", "0s - loss: 0.1921 - binary_accuracy: 0.7500\n", "Epoch 125/300\n", "0s - loss: 0.1916 - binary_accuracy: 0.7500\n", "Epoch 126/300\n", "0s - loss: 0.1912 - binary_accuracy: 0.7500\n", "Epoch 127/300\n", "0s - loss: 0.1908 - binary_accuracy: 0.7500\n", "Epoch 128/300\n", "0s - loss: 0.1903 - binary_accuracy: 0.7500\n", "Epoch 129/300\n", "0s - loss: 0.1899 - binary_accuracy: 0.7500\n", "Epoch 130/300\n", "0s - loss: 0.1895 - binary_accuracy: 0.7500\n", "Epoch 131/300\n", "0s - loss: 0.1891 - binary_accuracy: 0.7500\n", "Epoch 132/300\n", "0s - loss: 0.1886 - binary_accuracy: 0.7500\n", "Epoch 133/300\n", "0s - loss: 0.1882 - binary_accuracy: 0.7500\n", "Epoch 134/300\n", "0s - loss: 
0.1878 - binary_accuracy: 0.7500\n", "Epoch 135/300\n", "0s - loss: 0.1873 - binary_accuracy: 0.7500\n", "Epoch 136/300\n", "0s - loss: 0.1869 - binary_accuracy: 0.7500\n", "Epoch 137/300\n", "0s - loss: 0.1865 - binary_accuracy: 0.7500\n", "Epoch 138/300\n", "0s - loss: 0.1861 - binary_accuracy: 0.7500\n", "Epoch 139/300\n", "0s - loss: 0.1856 - binary_accuracy: 0.7500\n", "Epoch 140/300\n", "0s - loss: 0.1852 - binary_accuracy: 0.7500\n", "Epoch 141/300\n", "0s - loss: 0.1848 - binary_accuracy: 0.7500\n", "Epoch 142/300\n", "0s - loss: 0.1844 - binary_accuracy: 0.7500\n", "Epoch 143/300\n", "0s - loss: 0.1839 - binary_accuracy: 0.7500\n", "Epoch 144/300\n", "0s - loss: 0.1835 - binary_accuracy: 0.7500\n", "Epoch 145/300\n", "0s - loss: 0.1831 - binary_accuracy: 0.7500\n", "Epoch 146/300\n", "0s - loss: 0.1826 - binary_accuracy: 0.7500\n", "Epoch 147/300\n", "0s - loss: 0.1822 - binary_accuracy: 0.7500\n", "Epoch 148/300\n", "0s - loss: 0.1818 - binary_accuracy: 0.7500\n", "Epoch 149/300\n", "0s - loss: 0.1813 - binary_accuracy: 1.0000\n", "Epoch 150/300\n", "0s - loss: 0.1809 - binary_accuracy: 1.0000\n", "Epoch 151/300\n", "0s - loss: 0.1805 - binary_accuracy: 1.0000\n", "Epoch 152/300\n", "0s - loss: 0.1800 - binary_accuracy: 1.0000\n", "Epoch 153/300\n", "0s - loss: 0.1796 - binary_accuracy: 1.0000\n", "Epoch 154/300\n", "0s - loss: 0.1792 - binary_accuracy: 1.0000\n", "Epoch 155/300\n", "0s - loss: 0.1787 - binary_accuracy: 1.0000\n", "Epoch 156/300\n", "0s - loss: 0.1783 - binary_accuracy: 1.0000\n", "Epoch 157/300\n", "0s - loss: 0.1779 - binary_accuracy: 1.0000\n", "Epoch 158/300\n", "0s - loss: 0.1774 - binary_accuracy: 1.0000\n", "Epoch 159/300\n", "0s - loss: 0.1770 - binary_accuracy: 1.0000\n", "Epoch 160/300\n", "0s - loss: 0.1765 - binary_accuracy: 1.0000\n", "Epoch 161/300\n", "0s - loss: 0.1761 - binary_accuracy: 1.0000\n", "Epoch 162/300\n", "0s - loss: 0.1757 - binary_accuracy: 1.0000\n", "Epoch 163/300\n", "0s - loss: 0.1752 - binary_accuracy: 
1.0000\n", "Epoch 164/300\n", "0s - loss: 0.1748 - binary_accuracy: 1.0000\n", "Epoch 165/300\n", "0s - loss: 0.1744 - binary_accuracy: 1.0000\n", "Epoch 166/300\n", "0s - loss: 0.1739 - binary_accuracy: 1.0000\n", "Epoch 167/300\n", "0s - loss: 0.1735 - binary_accuracy: 1.0000\n", "Epoch 168/300\n", "0s - loss: 0.1730 - binary_accuracy: 1.0000\n", "Epoch 169/300\n", "0s - loss: 0.1726 - binary_accuracy: 1.0000\n", "Epoch 170/300\n", "0s - loss: 0.1722 - binary_accuracy: 1.0000\n", "Epoch 171/300\n", "0s - loss: 0.1717 - binary_accuracy: 1.0000\n", "Epoch 172/300\n", "0s - loss: 0.1713 - binary_accuracy: 1.0000\n", "Epoch 173/300\n", "0s - loss: 0.1708 - binary_accuracy: 1.0000\n", "Epoch 174/300\n", "0s - loss: 0.1704 - binary_accuracy: 1.0000\n", "Epoch 175/300\n", "0s - loss: 0.1700 - binary_accuracy: 1.0000\n", "Epoch 176/300\n", "0s - loss: 0.1695 - binary_accuracy: 1.0000\n", "Epoch 177/300\n", "0s - loss: 0.1691 - binary_accuracy: 1.0000\n", "Epoch 178/300\n", "0s - loss: 0.1686 - binary_accuracy: 1.0000\n", "Epoch 179/300\n", "0s - loss: 0.1682 - binary_accuracy: 1.0000\n", "Epoch 180/300\n", "0s - loss: 0.1677 - binary_accuracy: 1.0000\n", "Epoch 181/300\n", "0s - loss: 0.1673 - binary_accuracy: 1.0000\n", "Epoch 182/300\n", "0s - loss: 0.1668 - binary_accuracy: 1.0000\n", "Epoch 183/300\n", "0s - loss: 0.1663 - binary_accuracy: 1.0000\n", "Epoch 184/300\n", "0s - loss: 0.1658 - binary_accuracy: 1.0000\n", "Epoch 185/300\n", "0s - loss: 0.1653 - binary_accuracy: 1.0000\n", "Epoch 186/300\n", "0s - loss: 0.1649 - binary_accuracy: 1.0000\n", "Epoch 187/300\n", "0s - loss: 0.1644 - binary_accuracy: 1.0000\n", "Epoch 188/300\n", "0s - loss: 0.1639 - binary_accuracy: 1.0000\n", "Epoch 189/300\n", "0s - loss: 0.1634 - binary_accuracy: 1.0000\n", "Epoch 190/300\n", "0s - loss: 0.1629 - binary_accuracy: 1.0000\n", "Epoch 191/300\n", "0s - loss: 0.1624 - binary_accuracy: 1.0000\n", "Epoch 192/300\n", "0s - loss: 0.1620 - binary_accuracy: 1.0000\n", "Epoch 
193/300\n", "0s - loss: 0.1615 - binary_accuracy: 1.0000\n", "Epoch 194/300\n", "0s - loss: 0.1610 - binary_accuracy: 1.0000\n", "Epoch 195/300\n", "0s - loss: 0.1605 - binary_accuracy: 1.0000\n", "Epoch 196/300\n", "0s - loss: 0.1600 - binary_accuracy: 1.0000\n", "Epoch 197/300\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "0s - loss: 0.1595 - binary_accuracy: 1.0000\n", "Epoch 198/300\n", "0s - loss: 0.1590 - binary_accuracy: 1.0000\n", "Epoch 199/300\n", "0s - loss: 0.1585 - binary_accuracy: 1.0000\n", "Epoch 200/300\n", "0s - loss: 0.1580 - binary_accuracy: 1.0000\n", "Epoch 201/300\n", "0s - loss: 0.1576 - binary_accuracy: 1.0000\n", "Epoch 202/300\n", "0s - loss: 0.1571 - binary_accuracy: 1.0000\n", "Epoch 203/300\n", "0s - loss: 0.1566 - binary_accuracy: 1.0000\n", "Epoch 204/300\n", "0s - loss: 0.1561 - binary_accuracy: 1.0000\n", "Epoch 205/300\n", "0s - loss: 0.1556 - binary_accuracy: 1.0000\n", "Epoch 206/300\n", "0s - loss: 0.1551 - binary_accuracy: 1.0000\n", "Epoch 207/300\n", "0s - loss: 0.1546 - binary_accuracy: 1.0000\n", "Epoch 208/300\n", "0s - loss: 0.1541 - binary_accuracy: 1.0000\n", "Epoch 209/300\n", "0s - loss: 0.1537 - binary_accuracy: 1.0000\n", "Epoch 210/300\n", "0s - loss: 0.1532 - binary_accuracy: 1.0000\n", "Epoch 211/300\n", "0s - loss: 0.1527 - binary_accuracy: 1.0000\n", "Epoch 212/300\n", "0s - loss: 0.1522 - binary_accuracy: 1.0000\n", "Epoch 213/300\n", "0s - loss: 0.1517 - binary_accuracy: 1.0000\n", "Epoch 214/300\n", "0s - loss: 0.1512 - binary_accuracy: 1.0000\n", "Epoch 215/300\n", "0s - loss: 0.1507 - binary_accuracy: 1.0000\n", "Epoch 216/300\n", "0s - loss: 0.1503 - binary_accuracy: 1.0000\n", "Epoch 217/300\n", "0s - loss: 0.1498 - binary_accuracy: 1.0000\n", "Epoch 218/300\n", "0s - loss: 0.1493 - binary_accuracy: 1.0000\n", "Epoch 219/300\n", "0s - loss: 0.1488 - binary_accuracy: 1.0000\n", "Epoch 220/300\n", "0s - loss: 0.1483 - binary_accuracy: 1.0000\n", "Epoch 221/300\n", "0s - loss: 0.1478 - 
binary_accuracy: 1.0000\n", "Epoch 222/300\n", "0s - loss: 0.1474 - binary_accuracy: 1.0000\n", "Epoch 223/300\n", "0s - loss: 0.1469 - binary_accuracy: 1.0000\n", "Epoch 224/300\n", "0s - loss: 0.1464 - binary_accuracy: 1.0000\n", "Epoch 225/300\n", "0s - loss: 0.1459 - binary_accuracy: 1.0000\n", "Epoch 226/300\n", "0s - loss: 0.1454 - binary_accuracy: 1.0000\n", "Epoch 227/300\n", "0s - loss: 0.1450 - binary_accuracy: 1.0000\n", "Epoch 228/300\n", "0s - loss: 0.1445 - binary_accuracy: 1.0000\n", "Epoch 229/300\n", "0s - loss: 0.1440 - binary_accuracy: 1.0000\n", "Epoch 230/300\n", "0s - loss: 0.1435 - binary_accuracy: 1.0000\n", "Epoch 231/300\n", "0s - loss: 0.1431 - binary_accuracy: 1.0000\n", "Epoch 232/300\n", "0s - loss: 0.1426 - binary_accuracy: 1.0000\n", "Epoch 233/300\n", "0s - loss: 0.1421 - binary_accuracy: 1.0000\n", "Epoch 234/300\n", "0s - loss: 0.1416 - binary_accuracy: 1.0000\n", "Epoch 235/300\n", "0s - loss: 0.1411 - binary_accuracy: 1.0000\n", "Epoch 236/300\n", "0s - loss: 0.1407 - binary_accuracy: 1.0000\n", "Epoch 237/300\n", "0s - loss: 0.1402 - binary_accuracy: 1.0000\n", "Epoch 238/300\n", "0s - loss: 0.1397 - binary_accuracy: 1.0000\n", "Epoch 239/300\n", "0s - loss: 0.1393 - binary_accuracy: 1.0000\n", "Epoch 240/300\n", "0s - loss: 0.1388 - binary_accuracy: 1.0000\n", "Epoch 241/300\n", "0s - loss: 0.1383 - binary_accuracy: 1.0000\n", "Epoch 242/300\n", "0s - loss: 0.1378 - binary_accuracy: 1.0000\n", "Epoch 243/300\n", "0s - loss: 0.1374 - binary_accuracy: 1.0000\n", "Epoch 244/300\n", "0s - loss: 0.1369 - binary_accuracy: 1.0000\n", "Epoch 245/300\n", "0s - loss: 0.1364 - binary_accuracy: 1.0000\n", "Epoch 246/300\n", "0s - loss: 0.1360 - binary_accuracy: 1.0000\n", "Epoch 247/300\n", "0s - loss: 0.1355 - binary_accuracy: 1.0000\n", "Epoch 248/300\n", "0s - loss: 0.1350 - binary_accuracy: 1.0000\n", "Epoch 249/300\n", "0s - loss: 0.1346 - binary_accuracy: 1.0000\n", "Epoch 250/300\n", "0s - loss: 0.1341 - binary_accuracy: 1.0000\n", 
"Epoch 251/300\n", "0s - loss: 0.1336 - binary_accuracy: 1.0000\n", "Epoch 252/300\n", "0s - loss: 0.1331 - binary_accuracy: 1.0000\n", "Epoch 253/300\n", "0s - loss: 0.1327 - binary_accuracy: 1.0000\n", "Epoch 254/300\n", "0s - loss: 0.1322 - binary_accuracy: 1.0000\n", "Epoch 255/300\n", "0s - loss: 0.1317 - binary_accuracy: 1.0000\n", "Epoch 256/300\n", "0s - loss: 0.1313 - binary_accuracy: 1.0000\n", "Epoch 257/300\n", "0s - loss: 0.1308 - binary_accuracy: 1.0000\n", "Epoch 258/300\n", "0s - loss: 0.1303 - binary_accuracy: 1.0000\n", "Epoch 259/300\n", "0s - loss: 0.1299 - binary_accuracy: 1.0000\n", "Epoch 260/300\n", "0s - loss: 0.1294 - binary_accuracy: 1.0000\n", "Epoch 261/300\n", "0s - loss: 0.1289 - binary_accuracy: 1.0000\n", "Epoch 262/300\n", "0s - loss: 0.1284 - binary_accuracy: 1.0000\n", "Epoch 263/300\n", "0s - loss: 0.1280 - binary_accuracy: 1.0000\n", "Epoch 264/300\n", "0s - loss: 0.1275 - binary_accuracy: 1.0000\n", "Epoch 265/300\n", "0s - loss: 0.1270 - binary_accuracy: 1.0000\n", "Epoch 266/300\n", "0s - loss: 0.1266 - binary_accuracy: 1.0000\n", "Epoch 267/300\n", "0s - loss: 0.1262 - binary_accuracy: 1.0000\n", "Epoch 268/300\n", "0s - loss: 0.1257 - binary_accuracy: 1.0000\n", "Epoch 269/300\n", "0s - loss: 0.1253 - binary_accuracy: 1.0000\n", "Epoch 270/300\n", "0s - loss: 0.1248 - binary_accuracy: 1.0000\n", "Epoch 271/300\n", "0s - loss: 0.1243 - binary_accuracy: 1.0000\n", "Epoch 272/300\n", "0s - loss: 0.1239 - binary_accuracy: 1.0000\n", "Epoch 273/300\n", "0s - loss: 0.1234 - binary_accuracy: 1.0000\n", "Epoch 274/300\n", "0s - loss: 0.1230 - binary_accuracy: 1.0000\n", "Epoch 275/300\n", "0s - loss: 0.1226 - binary_accuracy: 1.0000\n", "Epoch 276/300\n", "0s - loss: 0.1221 - binary_accuracy: 1.0000\n", "Epoch 277/300\n", "0s - loss: 0.1217 - binary_accuracy: 1.0000\n", "Epoch 278/300\n", "0s - loss: 0.1212 - binary_accuracy: 1.0000\n", "Epoch 279/300\n", "0s - loss: 0.1208 - binary_accuracy: 1.0000\n", "Epoch 280/300\n", "0s - 
loss: 0.1204 - binary_accuracy: 1.0000\n", "Epoch 281/300\n", "0s - loss: 0.1199 - binary_accuracy: 1.0000\n", "Epoch 282/300\n", "0s - loss: 0.1195 - binary_accuracy: 1.0000\n", "Epoch 283/300\n", "0s - loss: 0.1190 - binary_accuracy: 1.0000\n", "Epoch 284/300\n", "0s - loss: 0.1186 - binary_accuracy: 1.0000\n", "Epoch 285/300\n", "0s - loss: 0.1182 - binary_accuracy: 1.0000\n", "Epoch 286/300\n", "0s - loss: 0.1177 - binary_accuracy: 1.0000\n", "Epoch 287/300\n", "0s - loss: 0.1173 - binary_accuracy: 1.0000\n", "Epoch 288/300\n", "0s - loss: 0.1169 - binary_accuracy: 1.0000\n", "Epoch 289/300\n", "0s - loss: 0.1165 - binary_accuracy: 1.0000\n", "Epoch 290/300\n", "0s - loss: 0.1160 - binary_accuracy: 1.0000\n", "Epoch 291/300\n", "0s - loss: 0.1156 - binary_accuracy: 1.0000\n", "Epoch 292/300\n", "0s - loss: 0.1152 - binary_accuracy: 1.0000\n", "Epoch 293/300\n", "0s - loss: 0.1148 - binary_accuracy: 1.0000\n", "Epoch 294/300\n", "0s - loss: 0.1143 - binary_accuracy: 1.0000\n", "Epoch 295/300\n", "0s - loss: 0.1139 - binary_accuracy: 1.0000\n", "Epoch 296/300\n", "0s - loss: 0.1135 - binary_accuracy: 1.0000\n", "Epoch 297/300\n", "0s - loss: 0.1131 - binary_accuracy: 1.0000\n", "Epoch 298/300\n", "0s - loss: 0.1126 - binary_accuracy: 1.0000\n", "Epoch 299/300\n", "0s - loss: 0.1122 - binary_accuracy: 1.0000\n", "Epoch 300/300\n", "0s - loss: 0.1118 - binary_accuracy: 1.0000\n", "\n", "Input after training:\n", " [[ 0. 0.]\n", " [ 0. 1.]\n", " [ 1. 0.]\n", " [ 1. 
1.]]\n", "\n", "Prediction:\n", " [[ 0.]\n", " [ 1.]\n", " [ 1.]\n", " [ 0.]]\n" ] } ], "source": [ "model2 = Sequential()\n", "\n", "# add Dense Layers to model\n", "model2.add(Dense(32, input_dim = 2, activation = 'relu'))\n", "model2.add(Dense(1, activation = 'sigmoid'))\n", "\n", "model2.compile(loss = 'mean_squared_error', # the objective that the model will try to minimize\n", " optimizer = 'adam', # optimizer finds the right adjustments for the weights\n", " metrics = ['binary_accuracy']) # mectric to judge the performance of the model\n", "model2.fit(training_data, target_data, epochs=300, verbose=2)\n", "\n", "print(\"\\nInput after training:\\n\", training_data)\n", "print(\"\\nPrediction:\\n\", model2.predict(training_data).round())" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "What if we add another layer?" ] }, { "cell_type": "code", "execution_count": 7, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Epoch 1/300\n", "0s - loss: 0.2554 - binary_accuracy: 0.5000\n", "Epoch 2/300\n", "0s - loss: 0.2539 - binary_accuracy: 0.2500\n", "Epoch 3/300\n", "0s - loss: 0.2524 - binary_accuracy: 0.2500\n", "Epoch 4/300\n", "0s - loss: 0.2511 - binary_accuracy: 0.2500\n", "Epoch 5/300\n", "0s - loss: 0.2496 - binary_accuracy: 0.5000\n", "Epoch 6/300\n", "0s - loss: 0.2482 - binary_accuracy: 0.5000\n", "Epoch 7/300\n", "0s - loss: 0.2469 - binary_accuracy: 0.7500\n", "Epoch 8/300\n", "0s - loss: 0.2455 - binary_accuracy: 0.7500\n", "Epoch 9/300\n", "0s - loss: 0.2442 - binary_accuracy: 0.7500\n", "Epoch 10/300\n", "0s - loss: 0.2429 - binary_accuracy: 0.5000\n", "Epoch 11/300\n", "0s - loss: 0.2416 - binary_accuracy: 0.5000\n", "Epoch 12/300\n", "0s - loss: 0.2404 - binary_accuracy: 0.5000\n", "Epoch 13/300\n", "0s - loss: 0.2391 - binary_accuracy: 0.5000\n", "Epoch 14/300\n", "0s - loss: 0.2378 - binary_accuracy: 0.5000\n", "Epoch 15/300\n", "0s - loss: 0.2366 - binary_accuracy: 0.7500\n", "Epoch 16/300\n", "0s 
- loss: 0.2355 - binary_accuracy: 0.7500\n", "Epoch 17/300\n", "0s - loss: 0.2343 - binary_accuracy: 0.7500\n", "Epoch 18/300\n", "0s - loss: 0.2332 - binary_accuracy: 0.7500\n", "Epoch 19/300\n", "0s - loss: 0.2320 - binary_accuracy: 0.7500\n", "Epoch 20/300\n", "0s - loss: 0.2308 - binary_accuracy: 1.0000\n", "Epoch 21/300\n", "0s - loss: 0.2296 - binary_accuracy: 1.0000\n", "Epoch 22/300\n", "0s - loss: 0.2285 - binary_accuracy: 1.0000\n", "Epoch 23/300\n", "0s - loss: 0.2273 - binary_accuracy: 1.0000\n", "Epoch 24/300\n", "0s - loss: 0.2261 - binary_accuracy: 1.0000\n", "Epoch 25/300\n", "0s - loss: 0.2249 - binary_accuracy: 1.0000\n", "Epoch 26/300\n", "0s - loss: 0.2237 - binary_accuracy: 1.0000\n", "Epoch 27/300\n", "0s - loss: 0.2225 - binary_accuracy: 1.0000\n", "Epoch 28/300\n", "0s - loss: 0.2213 - binary_accuracy: 1.0000\n", "Epoch 29/300\n", "0s - loss: 0.2201 - binary_accuracy: 1.0000\n", "Epoch 30/300\n", "0s - loss: 0.2190 - binary_accuracy: 1.0000\n", "Epoch 31/300\n", "0s - loss: 0.2178 - binary_accuracy: 1.0000\n", "Epoch 32/300\n", "0s - loss: 0.2166 - binary_accuracy: 1.0000\n", "Epoch 33/300\n", "0s - loss: 0.2154 - binary_accuracy: 1.0000\n", "Epoch 34/300\n", "0s - loss: 0.2142 - binary_accuracy: 1.0000\n", "Epoch 35/300\n", "0s - loss: 0.2130 - binary_accuracy: 1.0000\n", "Epoch 36/300\n", "0s - loss: 0.2118 - binary_accuracy: 1.0000\n", "Epoch 37/300\n", "0s - loss: 0.2105 - binary_accuracy: 1.0000\n", "Epoch 38/300\n", "0s - loss: 0.2093 - binary_accuracy: 1.0000\n", "Epoch 39/300\n", "0s - loss: 0.2080 - binary_accuracy: 1.0000\n", "Epoch 40/300\n", "0s - loss: 0.2068 - binary_accuracy: 1.0000\n", "Epoch 41/300\n", "0s - loss: 0.2055 - binary_accuracy: 1.0000\n", "Epoch 42/300\n", "0s - loss: 0.2042 - binary_accuracy: 1.0000\n", "Epoch 43/300\n", "0s - loss: 0.2030 - binary_accuracy: 1.0000\n", "Epoch 44/300\n", "0s - loss: 0.2017 - binary_accuracy: 1.0000\n", "Epoch 45/300\n", "0s - loss: 0.2004 - binary_accuracy: 1.0000\n", "Epoch 
46/300\n", "0s - loss: 0.1991 - binary_accuracy: 1.0000\n", "Epoch 47/300\n", "0s - loss: 0.1978 - binary_accuracy: 1.0000\n", "Epoch 48/300\n", "0s - loss: 0.1965 - binary_accuracy: 1.0000\n", "Epoch 49/300\n", "0s - loss: 0.1952 - binary_accuracy: 1.0000\n", "Epoch 50/300\n", "0s - loss: 0.1938 - binary_accuracy: 1.0000\n", "Epoch 51/300\n", "0s - loss: 0.1925 - binary_accuracy: 1.0000\n", "Epoch 52/300\n", "0s - loss: 0.1911 - binary_accuracy: 1.0000\n", "Epoch 53/300\n", "0s - loss: 0.1897 - binary_accuracy: 1.0000\n", "Epoch 54/300\n", "0s - loss: 0.1884 - binary_accuracy: 1.0000\n", "Epoch 55/300\n", "0s - loss: 0.1869 - binary_accuracy: 1.0000\n", "Epoch 56/300\n", "0s - loss: 0.1855 - binary_accuracy: 1.0000\n", "Epoch 57/300\n", "0s - loss: 0.1841 - binary_accuracy: 1.0000\n", "Epoch 58/300\n", "0s - loss: 0.1827 - binary_accuracy: 1.0000\n", "Epoch 59/300\n", "0s - loss: 0.1812 - binary_accuracy: 1.0000\n", "Epoch 60/300\n", "0s - loss: 0.1798 - binary_accuracy: 1.0000\n", "Epoch 61/300\n", "0s - loss: 0.1783 - binary_accuracy: 1.0000\n", "Epoch 62/300\n", "0s - loss: 0.1769 - binary_accuracy: 1.0000\n", "Epoch 63/300\n", "0s - loss: 0.1754 - binary_accuracy: 1.0000\n", "Epoch 64/300\n", "0s - loss: 0.1739 - binary_accuracy: 1.0000\n", "Epoch 65/300\n", "0s - loss: 0.1724 - binary_accuracy: 1.0000\n", "Epoch 66/300\n", "0s - loss: 0.1710 - binary_accuracy: 1.0000\n", "Epoch 67/300\n", "0s - loss: 0.1695 - binary_accuracy: 1.0000\n", "Epoch 68/300\n", "0s - loss: 0.1681 - binary_accuracy: 1.0000\n", "Epoch 69/300\n", "0s - loss: 0.1666 - binary_accuracy: 1.0000\n", "Epoch 70/300\n", "0s - loss: 0.1651 - binary_accuracy: 1.0000\n", "Epoch 71/300\n", "0s - loss: 0.1636 - binary_accuracy: 1.0000\n", "Epoch 72/300\n", "0s - loss: 0.1620 - binary_accuracy: 1.0000\n", "Epoch 73/300\n", "0s - loss: 0.1605 - binary_accuracy: 1.0000\n", "Epoch 74/300\n", "0s - loss: 0.1590 - binary_accuracy: 1.0000\n", "Epoch 75/300\n", "0s - loss: 0.1575 - binary_accuracy: 
1.0000\n", "Epoch 76/300\n", "0s - loss: 0.1560 - binary_accuracy: 1.0000\n", "Epoch 77/300\n", "0s - loss: 0.1545 - binary_accuracy: 1.0000\n", "Epoch 78/300\n", "0s - loss: 0.1530 - binary_accuracy: 1.0000\n", "Epoch 79/300\n", "0s - loss: 0.1515 - binary_accuracy: 1.0000\n", "Epoch 80/300\n", "0s - loss: 0.1500 - binary_accuracy: 1.0000\n", "Epoch 81/300\n", "0s - loss: 0.1485 - binary_accuracy: 1.0000\n", "Epoch 82/300\n", "0s - loss: 0.1470 - binary_accuracy: 1.0000\n", "Epoch 83/300\n", "0s - loss: 0.1454 - binary_accuracy: 1.0000\n", "Epoch 84/300\n", "0s - loss: 0.1440 - binary_accuracy: 1.0000\n", "Epoch 85/300\n", "0s - loss: 0.1425 - binary_accuracy: 1.0000\n", "Epoch 86/300\n", "0s - loss: 0.1410 - binary_accuracy: 1.0000\n", "Epoch 87/300\n", "0s - loss: 0.1395 - binary_accuracy: 1.0000\n", "Epoch 88/300\n", "0s - loss: 0.1380 - binary_accuracy: 1.0000\n", "Epoch 89/300\n", "0s - loss: 0.1365 - binary_accuracy: 1.0000\n", "Epoch 90/300\n", "0s - loss: 0.1351 - binary_accuracy: 1.0000\n", "Epoch 91/300\n", "0s - loss: 0.1336 - binary_accuracy: 1.0000\n", "Epoch 92/300\n", "0s - loss: 0.1321 - binary_accuracy: 1.0000\n", "Epoch 93/300\n", "0s - loss: 0.1307 - binary_accuracy: 1.0000\n", "Epoch 94/300\n", "0s - loss: 0.1292 - binary_accuracy: 1.0000\n", "Epoch 95/300\n", "0s - loss: 0.1277 - binary_accuracy: 1.0000\n", "Epoch 96/300\n", "0s - loss: 0.1263 - binary_accuracy: 1.0000\n", "Epoch 97/300\n", "0s - loss: 0.1249 - binary_accuracy: 1.0000\n", "Epoch 98/300\n", "0s - loss: 0.1234 - binary_accuracy: 1.0000\n", "Epoch 99/300\n", "0s - loss: 0.1220 - binary_accuracy: 1.0000\n", "Epoch 100/300\n", "0s - loss: 0.1205 - binary_accuracy: 1.0000\n", "Epoch 101/300\n", "0s - loss: 0.1191 - binary_accuracy: 1.0000\n", "Epoch 102/300\n", "0s - loss: 0.1177 - binary_accuracy: 1.0000\n", "Epoch 103/300\n", "0s - loss: 0.1163 - binary_accuracy: 1.0000\n", "Epoch 104/300\n", "0s - loss: 0.1149 - binary_accuracy: 1.0000\n", "Epoch 105/300\n", "0s - loss: 0.1136 - 
binary_accuracy: 1.0000\n", "Epoch 106/300\n", "0s - loss: 0.1122 - binary_accuracy: 1.0000\n", "Epoch 107/300\n", "0s - loss: 0.1109 - binary_accuracy: 1.0000\n", "Epoch 108/300\n", "0s - loss: 0.1095 - binary_accuracy: 1.0000\n", "Epoch 109/300\n", "0s - loss: 0.1082 - binary_accuracy: 1.0000\n", "Epoch 110/300\n", "0s - loss: 0.1068 - binary_accuracy: 1.0000\n", "Epoch 111/300\n", "0s - loss: 0.1055 - binary_accuracy: 1.0000\n", "Epoch 112/300\n", "0s - loss: 0.1042 - binary_accuracy: 1.0000\n", "Epoch 113/300\n", "0s - loss: 0.1029 - binary_accuracy: 1.0000\n", "Epoch 114/300\n", "0s - loss: 0.1016 - binary_accuracy: 1.0000\n", "Epoch 115/300\n", "0s - loss: 0.1003 - binary_accuracy: 1.0000\n", "Epoch 116/300\n", "0s - loss: 0.0990 - binary_accuracy: 1.0000\n", "Epoch 117/300\n", "0s - loss: 0.0978 - binary_accuracy: 1.0000\n", "Epoch 118/300\n", "0s - loss: 0.0966 - binary_accuracy: 1.0000\n", "Epoch 119/300\n", "0s - loss: 0.0953 - binary_accuracy: 1.0000\n", "Epoch 120/300\n", "0s - loss: 0.0941 - binary_accuracy: 1.0000\n", "Epoch 121/300\n", "0s - loss: 0.0929 - binary_accuracy: 1.0000\n", "Epoch 122/300\n", "0s - loss: 0.0917 - binary_accuracy: 1.0000\n", "Epoch 123/300\n", "0s - loss: 0.0905 - binary_accuracy: 1.0000\n", "Epoch 124/300\n", "0s - loss: 0.0894 - binary_accuracy: 1.0000\n", "Epoch 125/300\n", "0s - loss: 0.0882 - binary_accuracy: 1.0000\n", "Epoch 126/300\n", "0s - loss: 0.0871 - binary_accuracy: 1.0000\n", "Epoch 127/300\n", "0s - loss: 0.0859 - binary_accuracy: 1.0000\n", "Epoch 128/300\n", "0s - loss: 0.0848 - binary_accuracy: 1.0000\n", "Epoch 129/300\n", "0s - loss: 0.0837 - binary_accuracy: 1.0000\n", "Epoch 130/300\n", "0s - loss: 0.0826 - binary_accuracy: 1.0000\n", "Epoch 131/300\n", "0s - loss: 0.0815 - binary_accuracy: 1.0000\n", "Epoch 132/300\n", "0s - loss: 0.0805 - binary_accuracy: 1.0000\n", "Epoch 133/300\n", "0s - loss: 0.0794 - binary_accuracy: 1.0000\n", "Epoch 134/300\n", "0s - loss: 0.0784 - binary_accuracy: 1.0000\n", 
"Epoch 135/300\n", "0s - loss: 0.0773 - binary_accuracy: 1.0000\n", "Epoch 136/300\n", "0s - loss: 0.0763 - binary_accuracy: 1.0000\n", "Epoch 137/300\n", "0s - loss: 0.0753 - binary_accuracy: 1.0000\n", "Epoch 138/300\n", "0s - loss: 0.0743 - binary_accuracy: 1.0000\n", "Epoch 139/300\n", "0s - loss: 0.0734 - binary_accuracy: 1.0000\n", "Epoch 140/300\n", "0s - loss: 0.0724 - binary_accuracy: 1.0000\n", "Epoch 141/300\n", "0s - loss: 0.0714 - binary_accuracy: 1.0000\n", "Epoch 142/300\n", "0s - loss: 0.0705 - binary_accuracy: 1.0000\n", "Epoch 143/300\n", "0s - loss: 0.0696 - binary_accuracy: 1.0000\n", "Epoch 144/300\n", "0s - loss: 0.0687 - binary_accuracy: 1.0000\n", "Epoch 145/300\n", "0s - loss: 0.0678 - binary_accuracy: 1.0000\n", "Epoch 146/300\n", "0s - loss: 0.0669 - binary_accuracy: 1.0000\n", "Epoch 147/300\n", "0s - loss: 0.0660 - binary_accuracy: 1.0000\n", "Epoch 148/300\n", "0s - loss: 0.0651 - binary_accuracy: 1.0000\n", "Epoch 149/300\n", "0s - loss: 0.0643 - binary_accuracy: 1.0000\n", "Epoch 150/300\n", "0s - loss: 0.0634 - binary_accuracy: 1.0000\n", "Epoch 151/300\n", "0s - loss: 0.0626 - binary_accuracy: 1.0000\n", "Epoch 152/300\n", "0s - loss: 0.0618 - binary_accuracy: 1.0000\n", "Epoch 153/300\n", "0s - loss: 0.0609 - binary_accuracy: 1.0000\n", "Epoch 154/300\n", "0s - loss: 0.0602 - binary_accuracy: 1.0000\n", "Epoch 155/300\n", "0s - loss: 0.0594 - binary_accuracy: 1.0000\n", "Epoch 156/300\n", "0s - loss: 0.0586 - binary_accuracy: 1.0000\n", "Epoch 157/300\n", "0s - loss: 0.0579 - binary_accuracy: 1.0000\n", "Epoch 158/300\n", "0s - loss: 0.0571 - binary_accuracy: 1.0000\n", "Epoch 159/300\n", "0s - loss: 0.0563 - binary_accuracy: 1.0000\n", "Epoch 160/300\n", "0s - loss: 0.0556 - binary_accuracy: 1.0000\n", "Epoch 161/300\n", "0s - loss: 0.0549 - binary_accuracy: 1.0000\n", "Epoch 162/300\n", "0s - loss: 0.0542 - binary_accuracy: 1.0000\n", "Epoch 163/300\n", "0s - loss: 0.0535 - binary_accuracy: 1.0000\n", "Epoch 164/300\n", "0s - 
loss: 0.0528 - binary_accuracy: 1.0000\n", "Epoch 165/300\n", "0s - loss: 0.0521 - binary_accuracy: 1.0000\n", "Epoch 166/300\n", "0s - loss: 0.0514 - binary_accuracy: 1.0000\n", "Epoch 167/300\n", "0s - loss: 0.0508 - binary_accuracy: 1.0000\n", "Epoch 168/300\n", "0s - loss: 0.0501 - binary_accuracy: 1.0000\n", "Epoch 169/300\n", "0s - loss: 0.0494 - binary_accuracy: 1.0000\n", "Epoch 170/300\n", "0s - loss: 0.0488 - binary_accuracy: 1.0000\n", "Epoch 171/300\n", "0s - loss: 0.0481 - binary_accuracy: 1.0000\n", "Epoch 172/300\n", "0s - loss: 0.0475 - binary_accuracy: 1.0000\n", "Epoch 173/300\n", "0s - loss: 0.0469 - binary_accuracy: 1.0000\n", "Epoch 174/300\n", "0s - loss: 0.0462 - binary_accuracy: 1.0000\n", "Epoch 175/300\n", "0s - loss: 0.0456 - binary_accuracy: 1.0000\n", "Epoch 176/300\n", "0s - loss: 0.0450 - binary_accuracy: 1.0000\n", "Epoch 177/300\n", "0s - loss: 0.0444 - binary_accuracy: 1.0000\n", "Epoch 178/300\n", "0s - loss: 0.0438 - binary_accuracy: 1.0000\n", "Epoch 179/300\n", "0s - loss: 0.0433 - binary_accuracy: 1.0000\n", "Epoch 180/300\n", "0s - loss: 0.0427 - binary_accuracy: 1.0000\n", "Epoch 181/300\n", "0s - loss: 0.0421 - binary_accuracy: 1.0000\n", "Epoch 182/300\n", "0s - loss: 0.0416 - binary_accuracy: 1.0000\n", "Epoch 183/300\n", "0s - loss: 0.0410 - binary_accuracy: 1.0000\n", "Epoch 184/300\n", "0s - loss: 0.0405 - binary_accuracy: 1.0000\n", "Epoch 185/300\n", "0s - loss: 0.0399 - binary_accuracy: 1.0000\n", "Epoch 186/300\n", "0s - loss: 0.0394 - binary_accuracy: 1.0000\n", "Epoch 187/300\n", "0s - loss: 0.0389 - binary_accuracy: 1.0000\n", "Epoch 188/300\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "0s - loss: 0.0384 - binary_accuracy: 1.0000\n", "Epoch 189/300\n", "0s - loss: 0.0378 - binary_accuracy: 1.0000\n", "Epoch 190/300\n", "0s - loss: 0.0374 - binary_accuracy: 1.0000\n", "Epoch 191/300\n", "0s - loss: 0.0369 - binary_accuracy: 1.0000\n", "Epoch 192/300\n", "0s - loss: 0.0364 - binary_accuracy: 
1.0000\n", "Epoch 193/300\n", "0s - loss: 0.0359 - binary_accuracy: 1.0000\n", "Epoch 194/300\n", "0s - loss: 0.0354 - binary_accuracy: 1.0000\n", "Epoch 195/300\n", "0s - loss: 0.0349 - binary_accuracy: 1.0000\n", "Epoch 196/300\n", "0s - loss: 0.0345 - binary_accuracy: 1.0000\n", "Epoch 197/300\n", "0s - loss: 0.0340 - binary_accuracy: 1.0000\n", "Epoch 198/300\n", "0s - loss: 0.0336 - binary_accuracy: 1.0000\n", "Epoch 199/300\n", "0s - loss: 0.0331 - binary_accuracy: 1.0000\n", "Epoch 200/300\n", "0s - loss: 0.0327 - binary_accuracy: 1.0000\n", "Epoch 201/300\n", "0s - loss: 0.0323 - binary_accuracy: 1.0000\n", "Epoch 202/300\n", "0s - loss: 0.0318 - binary_accuracy: 1.0000\n", "Epoch 203/300\n", "0s - loss: 0.0314 - binary_accuracy: 1.0000\n", "Epoch 204/300\n", "0s - loss: 0.0310 - binary_accuracy: 1.0000\n", "Epoch 205/300\n", "0s - loss: 0.0306 - binary_accuracy: 1.0000\n", "Epoch 206/300\n", "0s - loss: 0.0302 - binary_accuracy: 1.0000\n", "Epoch 207/300\n", "0s - loss: 0.0298 - binary_accuracy: 1.0000\n", "Epoch 208/300\n", "0s - loss: 0.0294 - binary_accuracy: 1.0000\n", "Epoch 209/300\n", "0s - loss: 0.0290 - binary_accuracy: 1.0000\n", "Epoch 210/300\n", "0s - loss: 0.0286 - binary_accuracy: 1.0000\n", "Epoch 211/300\n", "0s - loss: 0.0282 - binary_accuracy: 1.0000\n", "Epoch 212/300\n", "0s - loss: 0.0279 - binary_accuracy: 1.0000\n", "Epoch 213/300\n", "0s - loss: 0.0275 - binary_accuracy: 1.0000\n", "Epoch 214/300\n", "0s - loss: 0.0271 - binary_accuracy: 1.0000\n", "Epoch 215/300\n", "0s - loss: 0.0268 - binary_accuracy: 1.0000\n", "Epoch 216/300\n", "0s - loss: 0.0264 - binary_accuracy: 1.0000\n", "Epoch 217/300\n", "0s - loss: 0.0261 - binary_accuracy: 1.0000\n", "Epoch 218/300\n", "0s - loss: 0.0257 - binary_accuracy: 1.0000\n", "Epoch 219/300\n", "0s - loss: 0.0254 - binary_accuracy: 1.0000\n", "Epoch 220/300\n", "0s - loss: 0.0250 - binary_accuracy: 1.0000\n", "Epoch 221/300\n", "0s - loss: 0.0247 - binary_accuracy: 1.0000\n", "Epoch 
222/300\n", "0s - loss: 0.0244 - binary_accuracy: 1.0000\n", "Epoch 223/300\n", "0s - loss: 0.0241 - binary_accuracy: 1.0000\n", "Epoch 224/300\n", "0s - loss: 0.0238 - binary_accuracy: 1.0000\n", "Epoch 225/300\n", "0s - loss: 0.0234 - binary_accuracy: 1.0000\n", "Epoch 226/300\n", "0s - loss: 0.0231 - binary_accuracy: 1.0000\n", "Epoch 227/300\n", "0s - loss: 0.0228 - binary_accuracy: 1.0000\n", "Epoch 228/300\n", "0s - loss: 0.0225 - binary_accuracy: 1.0000\n", "Epoch 229/300\n", "0s - loss: 0.0222 - binary_accuracy: 1.0000\n", "Epoch 230/300\n", "0s - loss: 0.0219 - binary_accuracy: 1.0000\n", "Epoch 231/300\n", "0s - loss: 0.0216 - binary_accuracy: 1.0000\n", "Epoch 232/300\n", "0s - loss: 0.0214 - binary_accuracy: 1.0000\n", "Epoch 233/300\n", "0s - loss: 0.0211 - binary_accuracy: 1.0000\n", "Epoch 234/300\n", "0s - loss: 0.0208 - binary_accuracy: 1.0000\n", "Epoch 235/300\n", "0s - loss: 0.0205 - binary_accuracy: 1.0000\n", "Epoch 236/300\n", "0s - loss: 0.0203 - binary_accuracy: 1.0000\n", "Epoch 237/300\n", "0s - loss: 0.0200 - binary_accuracy: 1.0000\n", "Epoch 238/300\n", "0s - loss: 0.0197 - binary_accuracy: 1.0000\n", "Epoch 239/300\n", "0s - loss: 0.0195 - binary_accuracy: 1.0000\n", "Epoch 240/300\n", "0s - loss: 0.0192 - binary_accuracy: 1.0000\n", "Epoch 241/300\n", "0s - loss: 0.0190 - binary_accuracy: 1.0000\n", "Epoch 242/300\n", "0s - loss: 0.0187 - binary_accuracy: 1.0000\n", "Epoch 243/300\n", "0s - loss: 0.0185 - binary_accuracy: 1.0000\n", "Epoch 244/300\n", "0s - loss: 0.0182 - binary_accuracy: 1.0000\n", "Epoch 245/300\n", "0s - loss: 0.0180 - binary_accuracy: 1.0000\n", "Epoch 246/300\n", "0s - loss: 0.0178 - binary_accuracy: 1.0000\n", "Epoch 247/300\n", "0s - loss: 0.0175 - binary_accuracy: 1.0000\n", "Epoch 248/300\n", "0s - loss: 0.0173 - binary_accuracy: 1.0000\n", "Epoch 249/300\n", "0s - loss: 0.0171 - binary_accuracy: 1.0000\n", "Epoch 250/300\n", "0s - loss: 0.0169 - binary_accuracy: 1.0000\n", "Epoch 251/300\n", "0s - loss: 
0.0166 - binary_accuracy: 1.0000\n", "Epoch 252/300\n", "0s - loss: 0.0164 - binary_accuracy: 1.0000\n", "Epoch 253/300\n", "0s - loss: 0.0162 - binary_accuracy: 1.0000\n", "Epoch 254/300\n", "0s - loss: 0.0160 - binary_accuracy: 1.0000\n", "Epoch 255/300\n", "0s - loss: 0.0158 - binary_accuracy: 1.0000\n", "Epoch 256/300\n", "0s - loss: 0.0156 - binary_accuracy: 1.0000\n", "Epoch 257/300\n", "0s - loss: 0.0154 - binary_accuracy: 1.0000\n", "Epoch 258/300\n", "0s - loss: 0.0152 - binary_accuracy: 1.0000\n", "Epoch 259/300\n", "0s - loss: 0.0150 - binary_accuracy: 1.0000\n", "Epoch 260/300\n", "0s - loss: 0.0148 - binary_accuracy: 1.0000\n", "Epoch 261/300\n", "0s - loss: 0.0146 - binary_accuracy: 1.0000\n", "Epoch 262/300\n", "0s - loss: 0.0144 - binary_accuracy: 1.0000\n", "Epoch 263/300\n", "0s - loss: 0.0142 - binary_accuracy: 1.0000\n", "Epoch 264/300\n", "0s - loss: 0.0141 - binary_accuracy: 1.0000\n", "Epoch 265/300\n", "0s - loss: 0.0139 - binary_accuracy: 1.0000\n", "Epoch 266/300\n", "0s - loss: 0.0137 - binary_accuracy: 1.0000\n", "Epoch 267/300\n", "0s - loss: 0.0135 - binary_accuracy: 1.0000\n", "Epoch 268/300\n", "0s - loss: 0.0134 - binary_accuracy: 1.0000\n", "Epoch 269/300\n", "0s - loss: 0.0132 - binary_accuracy: 1.0000\n", "Epoch 270/300\n", "0s - loss: 0.0130 - binary_accuracy: 1.0000\n", "Epoch 271/300\n", "0s - loss: 0.0129 - binary_accuracy: 1.0000\n", "Epoch 272/300\n", "0s - loss: 0.0127 - binary_accuracy: 1.0000\n", "Epoch 273/300\n", "0s - loss: 0.0126 - binary_accuracy: 1.0000\n", "Epoch 274/300\n", "0s - loss: 0.0124 - binary_accuracy: 1.0000\n", "Epoch 275/300\n", "0s - loss: 0.0122 - binary_accuracy: 1.0000\n", "Epoch 276/300\n", "0s - loss: 0.0121 - binary_accuracy: 1.0000\n", "Epoch 277/300\n", "0s - loss: 0.0119 - binary_accuracy: 1.0000\n", "Epoch 278/300\n", "0s - loss: 0.0118 - binary_accuracy: 1.0000\n", "Epoch 279/300\n", "0s - loss: 0.0117 - binary_accuracy: 1.0000\n", "Epoch 280/300\n", "0s - loss: 0.0115 - binary_accuracy: 
1.0000\n", "Epoch 281/300\n", "0s - loss: 0.0114 - binary_accuracy: 1.0000\n", "Epoch 282/300\n", "0s - loss: 0.0112 - binary_accuracy: 1.0000\n", "Epoch 283/300\n", "0s - loss: 0.0111 - binary_accuracy: 1.0000\n", "Epoch 284/300\n", "0s - loss: 0.0110 - binary_accuracy: 1.0000\n", "Epoch 285/300\n", "0s - loss: 0.0108 - binary_accuracy: 1.0000\n", "Epoch 286/300\n", "0s - loss: 0.0107 - binary_accuracy: 1.0000\n", "Epoch 287/300\n", "0s - loss: 0.0106 - binary_accuracy: 1.0000\n", "Epoch 288/300\n", "0s - loss: 0.0104 - binary_accuracy: 1.0000\n", "Epoch 289/300\n", "0s - loss: 0.0103 - binary_accuracy: 1.0000\n", "Epoch 290/300\n", "0s - loss: 0.0102 - binary_accuracy: 1.0000\n", "Epoch 291/300\n", "0s - loss: 0.0101 - binary_accuracy: 1.0000\n", "Epoch 292/300\n", "0s - loss: 0.0099 - binary_accuracy: 1.0000\n", "Epoch 293/300\n", "0s - loss: 0.0098 - binary_accuracy: 1.0000\n", "Epoch 294/300\n", "0s - loss: 0.0097 - binary_accuracy: 1.0000\n", "Epoch 295/300\n", "0s - loss: 0.0096 - binary_accuracy: 1.0000\n", "Epoch 296/300\n", "0s - loss: 0.0095 - binary_accuracy: 1.0000\n", "Epoch 297/300\n", "0s - loss: 0.0094 - binary_accuracy: 1.0000\n", "Epoch 298/300\n", "0s - loss: 0.0093 - binary_accuracy: 1.0000\n", "Epoch 299/300\n", "0s - loss: 0.0092 - binary_accuracy: 1.0000\n", "Epoch 300/300\n", "0s - loss: 0.0091 - binary_accuracy: 1.0000\n", "\n", "Input after training:\n", " [[ 0. 0.]\n", " [ 0. 1.]\n", " [ 1. 0.]\n", " [ 1. 
1.]]\n", "\n", "Prediction:\n", " [[ 0.]\n", " [ 1.]\n", " [ 1.]\n", " [ 0.]\n" ] } ], "source": [ "model3 = Sequential()\n", "\n", "# add Dense layers to the model\n", "model3.add(Dense(32, input_dim = 2, activation = 'relu'))\n", "model3.add(Dense(32, activation = 'relu'))\n", "model3.add(Dense(1, activation = 'sigmoid'))\n", "\n", "model3.compile(loss = 'mean_squared_error', # the objective that the model will try to minimize\n", " optimizer = 'adam', # the optimizer finds the right adjustments for the weights\n", " metrics = ['binary_accuracy']) # metric to judge the performance of the model\n", "model3.fit(training_data, target_data, epochs=300, verbose=2)\n", "\n", "print(\"\\nInput after training:\\n\", training_data)\n", "print(\"\\nPrediction:\\n\", model3.predict(training_data).round())" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "But is that the same as using just one hidden layer with 64 units?" ] }, { "cell_type": "code", "execution_count": 21, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Epoch 1/300\n", "0s - loss: 0.2593 - binary_accuracy: 0.5000\n", "Epoch 2/300\n", "0s - loss: 0.2586 - binary_accuracy: 0.5000\n", "Epoch 3/300\n", "0s - loss: 0.2578 - binary_accuracy: 0.5000\n", "Epoch 4/300\n", "0s - loss: 0.2570 - binary_accuracy: 0.5000\n", "Epoch 5/300\n", "0s - loss: 0.2562 - binary_accuracy: 0.5000\n", "Epoch 6/300\n", "0s - loss: 0.2554 - binary_accuracy: 0.5000\n", "Epoch 7/300\n", "0s - loss: 0.2546 - binary_accuracy: 0.5000\n", "Epoch 8/300\n", "0s - loss: 0.2538 - binary_accuracy: 0.5000\n", "Epoch 9/300\n", "0s - loss: 0.2530 - binary_accuracy: 0.5000\n", "Epoch 10/300\n", "0s - loss: 0.2522 - binary_accuracy: 0.5000\n", "Epoch 11/300\n", "0s - loss: 0.2514 - binary_accuracy: 0.5000\n", "Epoch 12/300\n", "0s - loss: 0.2507 - binary_accuracy: 0.5000\n", "Epoch 13/300\n", "0s - loss: 0.2499 - binary_accuracy: 0.5000\n", "Epoch 14/300\n", "0s - loss: 0.2492 - 
binary_accuracy: 0.5000\n", "Epoch 15/300\n", "0s - loss: 0.2484 - binary_accuracy: 0.5000\n", "Epoch 16/300\n", "0s - loss: 0.2477 - binary_accuracy: 0.5000\n", "Epoch 17/300\n", "0s - loss: 0.2469 - binary_accuracy: 0.5000\n", "Epoch 18/300\n", "0s - loss: 0.2462 - binary_accuracy: 0.5000\n", "Epoch 19/300\n", "0s - loss: 0.2455 - binary_accuracy: 0.5000\n", "Epoch 20/300\n", "0s - loss: 0.2447 - binary_accuracy: 0.7500\n", "Epoch 21/300\n", "0s - loss: 0.2440 - binary_accuracy: 0.7500\n", "Epoch 22/300\n", "0s - loss: 0.2433 - binary_accuracy: 0.7500\n", "Epoch 23/300\n", "0s - loss: 0.2426 - binary_accuracy: 0.7500\n", "Epoch 24/300\n", "0s - loss: 0.2419 - binary_accuracy: 0.7500\n", "Epoch 25/300\n", "0s - loss: 0.2412 - binary_accuracy: 0.7500\n", "Epoch 26/300\n", "0s - loss: 0.2405 - binary_accuracy: 0.7500\n", "Epoch 27/300\n", "0s - loss: 0.2397 - binary_accuracy: 0.7500\n", "Epoch 28/300\n", "0s - loss: 0.2391 - binary_accuracy: 0.7500\n", "Epoch 29/300\n", "0s - loss: 0.2384 - binary_accuracy: 0.7500\n", "Epoch 30/300\n", "0s - loss: 0.2377 - binary_accuracy: 0.7500\n", "Epoch 31/300\n", "0s - loss: 0.2370 - binary_accuracy: 0.7500\n", "Epoch 32/300\n", "0s - loss: 0.2363 - binary_accuracy: 0.7500\n", "Epoch 33/300\n", "0s - loss: 0.2356 - binary_accuracy: 1.0000\n", "Epoch 34/300\n", "0s - loss: 0.2349 - binary_accuracy: 1.0000\n", "Epoch 35/300\n", "0s - loss: 0.2342 - binary_accuracy: 1.0000\n", "Epoch 36/300\n", "0s - loss: 0.2335 - binary_accuracy: 1.0000\n", "Epoch 37/300\n", "0s - loss: 0.2329 - binary_accuracy: 1.0000\n", "Epoch 38/300\n", "0s - loss: 0.2322 - binary_accuracy: 1.0000\n", "Epoch 39/300\n", "0s - loss: 0.2315 - binary_accuracy: 1.0000\n", "Epoch 40/300\n", "0s - loss: 0.2308 - binary_accuracy: 1.0000\n", "Epoch 41/300\n", "0s - loss: 0.2302 - binary_accuracy: 1.0000\n", "Epoch 42/300\n", "0s - loss: 0.2295 - binary_accuracy: 1.0000\n", "Epoch 43/300\n", "0s - loss: 0.2289 - binary_accuracy: 1.0000\n", "Epoch 44/300\n", "0s - 
loss: 0.2282 - binary_accuracy: 1.0000\n", "Epoch 45/300\n", "0s - loss: 0.2276 - binary_accuracy: 1.0000\n", "Epoch 46/300\n", "0s - loss: 0.2269 - binary_accuracy: 1.0000\n", "Epoch 47/300\n", "0s - loss: 0.2263 - binary_accuracy: 1.0000\n", "Epoch 48/300\n", "0s - loss: 0.2257 - binary_accuracy: 1.0000\n", "Epoch 49/300\n", "0s - loss: 0.2251 - binary_accuracy: 1.0000\n", "Epoch 50/300\n", "0s - loss: 0.2244 - binary_accuracy: 1.0000\n", "Epoch 51/300\n", "0s - loss: 0.2238 - binary_accuracy: 1.0000\n", "Epoch 52/300\n", "0s - loss: 0.2232 - binary_accuracy: 1.0000\n", "Epoch 53/300\n", "0s - loss: 0.2226 - binary_accuracy: 1.0000\n", "Epoch 54/300\n", "0s - loss: 0.2220 - binary_accuracy: 1.0000\n", "Epoch 55/300\n", "0s - loss: 0.2214 - binary_accuracy: 1.0000\n", "Epoch 56/300\n", "0s - loss: 0.2208 - binary_accuracy: 1.0000\n", "Epoch 57/300\n", "0s - loss: 0.2201 - binary_accuracy: 1.0000\n", "Epoch 58/300\n", "0s - loss: 0.2195 - binary_accuracy: 1.0000\n", "Epoch 59/300\n", "0s - loss: 0.2189 - binary_accuracy: 1.0000\n", "Epoch 60/300\n", "0s - loss: 0.2183 - binary_accuracy: 1.0000\n", "Epoch 61/300\n", "0s - loss: 0.2177 - binary_accuracy: 1.0000\n", "Epoch 62/300\n", "0s - loss: 0.2171 - binary_accuracy: 1.0000\n", "Epoch 63/300\n", "0s - loss: 0.2165 - binary_accuracy: 1.0000\n", "Epoch 64/300\n", "0s - loss: 0.2159 - binary_accuracy: 1.0000\n", "Epoch 65/300\n", "0s - loss: 0.2153 - binary_accuracy: 1.0000\n", "Epoch 66/300\n", "0s - loss: 0.2146 - binary_accuracy: 1.0000\n", "Epoch 67/300\n", "0s - loss: 0.2140 - binary_accuracy: 1.0000\n", "Epoch 68/300\n", "0s - loss: 0.2134 - binary_accuracy: 1.0000\n", "Epoch 69/300\n", "0s - loss: 0.2128 - binary_accuracy: 1.0000\n", "Epoch 70/300\n", "0s - loss: 0.2122 - binary_accuracy: 1.0000\n", "Epoch 71/300\n", "0s - loss: 0.2116 - binary_accuracy: 1.0000\n", "Epoch 72/300\n", "0s - loss: 0.2109 - binary_accuracy: 1.0000\n", "Epoch 73/300\n", "0s - loss: 0.2103 - binary_accuracy: 1.0000\n", "Epoch 
74/300\n", "0s - loss: 0.2097 - binary_accuracy: 1.0000\n", "Epoch 75/300\n", "0s - loss: 0.2091 - binary_accuracy: 1.0000\n", "Epoch 76/300\n", "0s - loss: 0.2085 - binary_accuracy: 1.0000\n", "Epoch 77/300\n", "0s - loss: 0.2079 - binary_accuracy: 1.0000\n", "Epoch 78/300\n", "0s - loss: 0.2073 - binary_accuracy: 1.0000\n", "Epoch 79/300\n", "0s - loss: 0.2067 - binary_accuracy: 1.0000\n", "Epoch 80/300\n", "0s - loss: 0.2060 - binary_accuracy: 1.0000\n", "Epoch 81/300\n", "0s - loss: 0.2054 - binary_accuracy: 1.0000\n", "Epoch 82/300\n", "0s - loss: 0.2048 - binary_accuracy: 1.0000\n", "Epoch 83/300\n", "0s - loss: 0.2042 - binary_accuracy: 1.0000\n", "Epoch 84/300\n", "0s - loss: 0.2035 - binary_accuracy: 1.0000\n", "Epoch 85/300\n", "0s - loss: 0.2029 - binary_accuracy: 1.0000\n", "Epoch 86/300\n", "0s - loss: 0.2023 - binary_accuracy: 1.0000\n", "Epoch 87/300\n", "0s - loss: 0.2017 - binary_accuracy: 1.0000\n", "Epoch 88/300\n", "0s - loss: 0.2011 - binary_accuracy: 1.0000\n", "Epoch 89/300\n", "0s - loss: 0.2005 - binary_accuracy: 1.0000\n", "Epoch 90/300\n", "0s - loss: 0.1999 - binary_accuracy: 1.0000\n", "Epoch 91/300\n", "0s - loss: 0.1993 - binary_accuracy: 1.0000\n", "Epoch 92/300\n", "0s - loss: 0.1987 - binary_accuracy: 1.0000\n", "Epoch 93/300\n", "0s - loss: 0.1980 - binary_accuracy: 1.0000\n", "Epoch 94/300\n", "0s - loss: 0.1974 - binary_accuracy: 1.0000\n", "Epoch 95/300\n", "0s - loss: 0.1968 - binary_accuracy: 1.0000\n", "Epoch 96/300\n", "0s - loss: 0.1962 - binary_accuracy: 1.0000\n", "Epoch 97/300\n", "0s - loss: 0.1956 - binary_accuracy: 1.0000\n", "Epoch 98/300\n", "0s - loss: 0.1950 - binary_accuracy: 1.0000\n", "Epoch 99/300\n", "0s - loss: 0.1944 - binary_accuracy: 1.0000\n", "Epoch 100/300\n", "0s - loss: 0.1938 - binary_accuracy: 1.0000\n", "Epoch 101/300\n", "0s - loss: 0.1932 - binary_accuracy: 1.0000\n", "Epoch 102/300\n", "0s - loss: 0.1925 - binary_accuracy: 1.0000\n", "Epoch 103/300\n", "0s - loss: 0.1919 - binary_accuracy: 
1.0000\n", "Epoch 104/300\n", "0s - loss: 0.1913 - binary_accuracy: 1.0000\n", "Epoch 105/300\n", "0s - loss: 0.1907 - binary_accuracy: 1.0000\n", "Epoch 106/300\n", "0s - loss: 0.1900 - binary_accuracy: 1.0000\n", "Epoch 107/300\n", "0s - loss: 0.1894 - binary_accuracy: 1.0000\n", "Epoch 108/300\n", "0s - loss: 0.1888 - binary_accuracy: 1.0000\n", "Epoch 109/300\n", "0s - loss: 0.1882 - binary_accuracy: 1.0000\n", "Epoch 110/300\n", "0s - loss: 0.1875 - binary_accuracy: 1.0000\n", "Epoch 111/300\n", "0s - loss: 0.1869 - binary_accuracy: 1.0000\n", "Epoch 112/300\n", "0s - loss: 0.1862 - binary_accuracy: 1.0000\n", "Epoch 113/300\n", "0s - loss: 0.1856 - binary_accuracy: 1.0000\n", "Epoch 114/300\n", "0s - loss: 0.1849 - binary_accuracy: 1.0000\n", "Epoch 115/300\n", "0s - loss: 0.1843 - binary_accuracy: 1.0000\n", "Epoch 116/300\n", "0s - loss: 0.1836 - binary_accuracy: 1.0000\n", "Epoch 117/300\n", "0s - loss: 0.1830 - binary_accuracy: 1.0000\n", "Epoch 118/300\n", "0s - loss: 0.1823 - binary_accuracy: 1.0000\n", "Epoch 119/300\n", "0s - loss: 0.1817 - binary_accuracy: 1.0000\n", "Epoch 120/300\n", "0s - loss: 0.1810 - binary_accuracy: 1.0000\n", "Epoch 121/300\n", "0s - loss: 0.1804 - binary_accuracy: 1.0000\n", "Epoch 122/300\n", "0s - loss: 0.1797 - binary_accuracy: 1.0000\n", "Epoch 123/300\n", "0s - loss: 0.1790 - binary_accuracy: 1.0000\n", "Epoch 124/300\n", "0s - loss: 0.1784 - binary_accuracy: 1.0000\n", "Epoch 125/300\n", "0s - loss: 0.1777 - binary_accuracy: 1.0000\n", "Epoch 126/300\n", "0s - loss: 0.1771 - binary_accuracy: 1.0000\n", "Epoch 127/300\n", "0s - loss: 0.1764 - binary_accuracy: 1.0000\n", "Epoch 128/300\n", "0s - loss: 0.1758 - binary_accuracy: 1.0000\n", "Epoch 129/300\n", "0s - loss: 0.1751 - binary_accuracy: 1.0000\n", "Epoch 130/300\n", "0s - loss: 0.1744 - binary_accuracy: 1.0000\n", "Epoch 131/300\n", "0s - loss: 0.1737 - binary_accuracy: 1.0000\n", "Epoch 132/300\n", "0s - loss: 0.1731 - binary_accuracy: 1.0000\n", "Epoch 
133/300\n", "0s - loss: 0.1724 - binary_accuracy: 1.0000\n", "Epoch 134/300\n", "0s - loss: 0.1718 - binary_accuracy: 1.0000\n", "Epoch 135/300\n", "0s - loss: 0.1711 - binary_accuracy: 1.0000\n", "Epoch 136/300\n", "0s - loss: 0.1704 - binary_accuracy: 1.0000\n", "Epoch 137/300\n", "0s - loss: 0.1698 - binary_accuracy: 1.0000\n", "Epoch 138/300\n", "0s - loss: 0.1691 - binary_accuracy: 1.0000\n", "Epoch 139/300\n", "0s - loss: 0.1684 - binary_accuracy: 1.0000\n", "Epoch 140/300\n", "0s - loss: 0.1677 - binary_accuracy: 1.0000\n", "Epoch 141/300\n", "0s - loss: 0.1671 - binary_accuracy: 1.0000\n", "Epoch 142/300\n", "0s - loss: 0.1664 - binary_accuracy: 1.0000\n", "Epoch 143/300\n", "0s - loss: 0.1657 - binary_accuracy: 1.0000\n", "Epoch 144/300\n", "0s - loss: 0.1651 - binary_accuracy: 1.0000\n", "Epoch 145/300\n", "0s - loss: 0.1644 - binary_accuracy: 1.0000\n", "Epoch 146/300\n", "0s - loss: 0.1637 - binary_accuracy: 1.0000\n", "Epoch 147/300\n", "0s - loss: 0.1630 - binary_accuracy: 1.0000\n", "Epoch 148/300\n", "0s - loss: 0.1624 - binary_accuracy: 1.0000\n", "Epoch 149/300\n", "0s - loss: 0.1617 - binary_accuracy: 1.0000\n", "Epoch 150/300\n", "0s - loss: 0.1610 - binary_accuracy: 1.0000\n", "Epoch 151/300\n", "0s - loss: 0.1603 - binary_accuracy: 1.0000\n", "Epoch 152/300\n", "0s - loss: 0.1596 - binary_accuracy: 1.0000\n", "Epoch 153/300\n", "0s - loss: 0.1590 - binary_accuracy: 1.0000\n", "Epoch 154/300\n", "0s - loss: 0.1583 - binary_accuracy: 1.0000\n", "Epoch 155/300\n", "0s - loss: 0.1576 - binary_accuracy: 1.0000\n", "Epoch 156/300\n", "0s - loss: 0.1569 - binary_accuracy: 1.0000\n", "Epoch 157/300\n", "0s - loss: 0.1562 - binary_accuracy: 1.0000\n", "Epoch 158/300\n", "0s - loss: 0.1556 - binary_accuracy: 1.0000\n", "Epoch 159/300\n", "0s - loss: 0.1549 - binary_accuracy: 1.0000\n", "Epoch 160/300\n", "0s - loss: 0.1542 - binary_accuracy: 1.0000\n", "Epoch 161/300\n", "0s - loss: 0.1535 - binary_accuracy: 1.0000\n", "Epoch 162/300\n", "0s - loss: 
0.1528 - binary_accuracy: 1.0000\n", "Epoch 163/300\n", "0s - loss: 0.1521 - binary_accuracy: 1.0000\n", "Epoch 164/300\n", "0s - loss: 0.1515 - binary_accuracy: 1.0000\n", "Epoch 165/300\n", "0s - loss: 0.1508 - binary_accuracy: 1.0000\n", "Epoch 166/300\n", "0s - loss: 0.1501 - binary_accuracy: 1.0000\n", "Epoch 167/300\n", "0s - loss: 0.1494 - binary_accuracy: 1.0000\n", "Epoch 168/300\n", "0s - loss: 0.1487 - binary_accuracy: 1.0000\n", "Epoch 169/300\n", "0s - loss: 0.1481 - binary_accuracy: 1.0000\n", "Epoch 170/300\n", "0s - loss: 0.1474 - binary_accuracy: 1.0000\n", "Epoch 171/300\n", "0s - loss: 0.1467 - binary_accuracy: 1.0000\n", "Epoch 172/300\n", "0s - loss: 0.1460 - binary_accuracy: 1.0000\n", "Epoch 173/300\n", "0s - loss: 0.1453 - binary_accuracy: 1.0000\n", "Epoch 174/300\n", "0s - loss: 0.1447 - binary_accuracy: 1.0000\n", "Epoch 175/300\n", "0s - loss: 0.1440 - binary_accuracy: 1.0000\n", "Epoch 176/300\n", "0s - loss: 0.1433 - binary_accuracy: 1.0000\n", "Epoch 177/300\n", "0s - loss: 0.1426 - binary_accuracy: 1.0000\n", "Epoch 178/300\n", "0s - loss: 0.1419 - binary_accuracy: 1.0000\n", "Epoch 179/300\n", "0s - loss: 0.1413 - binary_accuracy: 1.0000\n", "Epoch 180/300\n", "0s - loss: 0.1406 - binary_accuracy: 1.0000\n", "Epoch 181/300\n", "0s - loss: 0.1399 - binary_accuracy: 1.0000\n", "Epoch 182/300\n", "0s - loss: 0.1392 - binary_accuracy: 1.0000\n", "Epoch 183/300\n", "0s - loss: 0.1386 - binary_accuracy: 1.0000\n", "Epoch 184/300\n", "0s - loss: 0.1379 - binary_accuracy: 1.0000\n", "Epoch 185/300\n", "0s - loss: 0.1372 - binary_accuracy: 1.0000\n", "Epoch 186/300\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "0s - loss: 0.1365 - binary_accuracy: 1.0000\n", "Epoch 187/300\n", "0s - loss: 0.1358 - binary_accuracy: 1.0000\n", "Epoch 188/300\n", "0s - loss: 0.1352 - binary_accuracy: 1.0000\n", "Epoch 189/300\n", "0s - loss: 0.1345 - binary_accuracy: 1.0000\n", "Epoch 190/300\n", "0s - loss: 0.1338 - binary_accuracy: 1.0000\n", 
"Epoch 191/300\n", "0s - loss: 0.1332 - binary_accuracy: 1.0000\n", "Epoch 192/300\n", "0s - loss: 0.1325 - binary_accuracy: 1.0000\n", "Epoch 193/300\n", "0s - loss: 0.1318 - binary_accuracy: 1.0000\n", "Epoch 194/300\n", "0s - loss: 0.1311 - binary_accuracy: 1.0000\n", "Epoch 195/300\n", "0s - loss: 0.1305 - binary_accuracy: 1.0000\n", "Epoch 196/300\n", "0s - loss: 0.1298 - binary_accuracy: 1.0000\n", "Epoch 197/300\n", "0s - loss: 0.1291 - binary_accuracy: 1.0000\n", "Epoch 198/300\n", "0s - loss: 0.1285 - binary_accuracy: 1.0000\n", "Epoch 199/300\n", "0s - loss: 0.1278 - binary_accuracy: 1.0000\n", "Epoch 200/300\n", "0s - loss: 0.1271 - binary_accuracy: 1.0000\n", "Epoch 201/300\n", "0s - loss: 0.1265 - binary_accuracy: 1.0000\n", "Epoch 202/300\n", "0s - loss: 0.1258 - binary_accuracy: 1.0000\n", "Epoch 203/300\n", "0s - loss: 0.1252 - binary_accuracy: 1.0000\n", "Epoch 204/300\n", "0s - loss: 0.1245 - binary_accuracy: 1.0000\n", "Epoch 205/300\n", "0s - loss: 0.1238 - binary_accuracy: 1.0000\n", "Epoch 206/300\n", "0s - loss: 0.1232 - binary_accuracy: 1.0000\n", "Epoch 207/300\n", "0s - loss: 0.1226 - binary_accuracy: 1.0000\n", "Epoch 208/300\n", "0s - loss: 0.1219 - binary_accuracy: 1.0000\n", "Epoch 209/300\n", "0s - loss: 0.1212 - binary_accuracy: 1.0000\n", "Epoch 210/300\n", "0s - loss: 0.1206 - binary_accuracy: 1.0000\n", "Epoch 211/300\n", "0s - loss: 0.1199 - binary_accuracy: 1.0000\n", "Epoch 212/300\n", "0s - loss: 0.1193 - binary_accuracy: 1.0000\n", "Epoch 213/300\n", "0s - loss: 0.1186 - binary_accuracy: 1.0000\n", "Epoch 214/300\n", "0s - loss: 0.1180 - binary_accuracy: 1.0000\n", "Epoch 215/300\n", "0s - loss: 0.1173 - binary_accuracy: 1.0000\n", "Epoch 216/300\n", "0s - loss: 0.1167 - binary_accuracy: 1.0000\n", "Epoch 217/300\n", "0s - loss: 0.1161 - binary_accuracy: 1.0000\n", "Epoch 218/300\n", "0s - loss: 0.1154 - binary_accuracy: 1.0000\n", "Epoch 219/300\n", "0s - loss: 0.1148 - binary_accuracy: 1.0000\n", "Epoch 220/300\n", "0s - 
loss: 0.1142 - binary_accuracy: 1.0000\n", "Epoch 221/300\n", "0s - loss: 0.1135 - binary_accuracy: 1.0000\n", "Epoch 222/300\n", "0s - loss: 0.1129 - binary_accuracy: 1.0000\n", "Epoch 223/300\n", "0s - loss: 0.1123 - binary_accuracy: 1.0000\n", "Epoch 224/300\n", "0s - loss: 0.1116 - binary_accuracy: 1.0000\n", "Epoch 225/300\n", "0s - loss: 0.1110 - binary_accuracy: 1.0000\n", "Epoch 226/300\n", "0s - loss: 0.1104 - binary_accuracy: 1.0000\n", "Epoch 227/300\n", "0s - loss: 0.1098 - binary_accuracy: 1.0000\n", "Epoch 228/300\n", "0s - loss: 0.1091 - binary_accuracy: 1.0000\n", "Epoch 229/300\n", "0s - loss: 0.1085 - binary_accuracy: 1.0000\n", "Epoch 230/300\n", "0s - loss: 0.1079 - binary_accuracy: 1.0000\n", "Epoch 231/300\n", "0s - loss: 0.1073 - binary_accuracy: 1.0000\n", "Epoch 232/300\n", "0s - loss: 0.1067 - binary_accuracy: 1.0000\n", "Epoch 233/300\n", "0s - loss: 0.1061 - binary_accuracy: 1.0000\n", "Epoch 234/300\n", "0s - loss: 0.1055 - binary_accuracy: 1.0000\n", "Epoch 235/300\n", "0s - loss: 0.1049 - binary_accuracy: 1.0000\n", "Epoch 236/300\n", "0s - loss: 0.1043 - binary_accuracy: 1.0000\n", "Epoch 237/300\n", "0s - loss: 0.1037 - binary_accuracy: 1.0000\n", "Epoch 238/300\n", "0s - loss: 0.1031 - binary_accuracy: 1.0000\n", "Epoch 239/300\n", "0s - loss: 0.1025 - binary_accuracy: 1.0000\n", "Epoch 240/300\n", "0s - loss: 0.1019 - binary_accuracy: 1.0000\n", "Epoch 241/300\n", "0s - loss: 0.1013 - binary_accuracy: 1.0000\n", "Epoch 242/300\n", "0s - loss: 0.1007 - binary_accuracy: 1.0000\n", "Epoch 243/300\n", "0s - loss: 0.1001 - binary_accuracy: 1.0000\n", "Epoch 244/300\n", "0s - loss: 0.0995 - binary_accuracy: 1.0000\n", "Epoch 245/300\n", "0s - loss: 0.0989 - binary_accuracy: 1.0000\n", "Epoch 246/300\n", "0s - loss: 0.0984 - binary_accuracy: 1.0000\n", "Epoch 247/300\n", "0s - loss: 0.0978 - binary_accuracy: 1.0000\n", "Epoch 248/300\n", "0s - loss: 0.0972 - binary_accuracy: 1.0000\n", "Epoch 249/300\n", "0s - loss: 0.0966 - 
binary_accuracy: 1.0000\n", "Epoch 250/300\n", "0s - loss: 0.0961 - binary_accuracy: 1.0000\n", "Epoch 251/300\n", "0s - loss: 0.0955 - binary_accuracy: 1.0000\n", "Epoch 252/300\n", "0s - loss: 0.0949 - binary_accuracy: 1.0000\n", "Epoch 253/300\n", "0s - loss: 0.0944 - binary_accuracy: 1.0000\n", "Epoch 254/300\n", "0s - loss: 0.0938 - binary_accuracy: 1.0000\n", "Epoch 255/300\n", "0s - loss: 0.0932 - binary_accuracy: 1.0000\n", "Epoch 256/300\n", "0s - loss: 0.0927 - binary_accuracy: 1.0000\n", "Epoch 257/300\n", "0s - loss: 0.0921 - binary_accuracy: 1.0000\n", "Epoch 258/300\n", "0s - loss: 0.0916 - binary_accuracy: 1.0000\n", "Epoch 259/300\n", "0s - loss: 0.0910 - binary_accuracy: 1.0000\n", "Epoch 260/300\n", "0s - loss: 0.0905 - binary_accuracy: 1.0000\n", "Epoch 261/300\n", "0s - loss: 0.0899 - binary_accuracy: 1.0000\n", "Epoch 262/300\n", "0s - loss: 0.0894 - binary_accuracy: 1.0000\n", "Epoch 263/300\n", "0s - loss: 0.0888 - binary_accuracy: 1.0000\n", "Epoch 264/300\n", "0s - loss: 0.0883 - binary_accuracy: 1.0000\n", "Epoch 265/300\n", "0s - loss: 0.0878 - binary_accuracy: 1.0000\n", "Epoch 266/300\n", "0s - loss: 0.0872 - binary_accuracy: 1.0000\n", "Epoch 267/300\n", "0s - loss: 0.0867 - binary_accuracy: 1.0000\n", "Epoch 268/300\n", "0s - loss: 0.0862 - binary_accuracy: 1.0000\n", "Epoch 269/300\n", "0s - loss: 0.0857 - binary_accuracy: 1.0000\n", "Epoch 270/300\n", "0s - loss: 0.0851 - binary_accuracy: 1.0000\n", "Epoch 271/300\n", "0s - loss: 0.0846 - binary_accuracy: 1.0000\n", "Epoch 272/300\n", "0s - loss: 0.0841 - binary_accuracy: 1.0000\n", "Epoch 273/300\n", "0s - loss: 0.0836 - binary_accuracy: 1.0000\n", "Epoch 274/300\n", "0s - loss: 0.0831 - binary_accuracy: 1.0000\n", "Epoch 275/300\n", "0s - loss: 0.0826 - binary_accuracy: 1.0000\n", "Epoch 276/300\n", "0s - loss: 0.0821 - binary_accuracy: 1.0000\n", "Epoch 277/300\n", "0s - loss: 0.0815 - binary_accuracy: 1.0000\n", "Epoch 278/300\n", "0s - loss: 0.0810 - binary_accuracy: 1.0000\n", 
"Epoch 279/300\n", "0s - loss: 0.0805 - binary_accuracy: 1.0000\n", "Epoch 280/300\n", "0s - loss: 0.0800 - binary_accuracy: 1.0000\n", "Epoch 281/300\n", "0s - loss: 0.0795 - binary_accuracy: 1.0000\n", "Epoch 282/300\n", "0s - loss: 0.0791 - binary_accuracy: 1.0000\n", "Epoch 283/300\n", "0s - loss: 0.0786 - binary_accuracy: 1.0000\n", "Epoch 284/300\n", "0s - loss: 0.0781 - binary_accuracy: 1.0000\n", "Epoch 285/300\n", "0s - loss: 0.0776 - binary_accuracy: 1.0000\n", "Epoch 286/300\n", "0s - loss: 0.0771 - binary_accuracy: 1.0000\n", "Epoch 287/300\n", "0s - loss: 0.0766 - binary_accuracy: 1.0000\n", "Epoch 288/300\n", "0s - loss: 0.0762 - binary_accuracy: 1.0000\n", "Epoch 289/300\n", "0s - loss: 0.0757 - binary_accuracy: 1.0000\n", "Epoch 290/300\n", "0s - loss: 0.0752 - binary_accuracy: 1.0000\n", "Epoch 291/300\n", "0s - loss: 0.0747 - binary_accuracy: 1.0000\n", "Epoch 292/300\n", "0s - loss: 0.0743 - binary_accuracy: 1.0000\n", "Epoch 293/300\n", "0s - loss: 0.0738 - binary_accuracy: 1.0000\n", "Epoch 294/300\n", "0s - loss: 0.0733 - binary_accuracy: 1.0000\n", "Epoch 295/300\n", "0s - loss: 0.0729 - binary_accuracy: 1.0000\n", "Epoch 296/300\n", "0s - loss: 0.0724 - binary_accuracy: 1.0000\n", "Epoch 297/300\n", "0s - loss: 0.0720 - binary_accuracy: 1.0000\n", "Epoch 298/300\n", "0s - loss: 0.0715 - binary_accuracy: 1.0000\n", "Epoch 299/300\n", "0s - loss: 0.0711 - binary_accuracy: 1.0000\n", "Epoch 300/300\n", "0s - loss: 0.0706 - binary_accuracy: 1.0000\n", "\n", "Input after training:\n", " [[ 0. 0.]\n", " [ 0. 1.]\n", " [ 1. 0.]\n", " [ 1. 
1.]]\n", "\n", "Prediction:\n", " [[ 0.]\n", " [ 1.]\n", " [ 1.]\n", " [ 0.]\n" ] } ], "source": [ "model4 = Sequential()\n", "\n", "# add Dense layers to the model\n", "model4.add(Dense(64, input_dim = 2, activation = 'relu'))\n", "model4.add(Dense(1, activation = 'sigmoid'))\n", "\n", "model4.compile(loss = 'mean_squared_error', # the objective that the model will try to minimize\n", " optimizer = 'adam', # the optimizer finds the right adjustments for the weights\n", " metrics = ['binary_accuracy']) # metric to judge the performance of the model\n", "model4.fit(training_data, target_data, epochs=300, verbose=2)\n", "\n", "print(\"\\nInput after training:\\n\", training_data)\n", "print(\"\\nPrediction:\\n\", model4.predict(training_data).round())" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Notice how much we can experiment with and learn about a network once we start looking at the right metrics." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Data Classification" ] }, { "cell_type": "code", "execution_count": 9, "metadata": {}, "outputs": [ { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAYkAAAFqCAYAAADvDaaRAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAF+RJREFUeJzt3X9sVeXhx/HPaUsLpZVeXBHjAoKlVGKUQtzCQicEK7BI\nQNPNWgMSlmYjTVHkp2xjTpxS1yVsXbBSZhhVKP5KYIthG7jxS8aQXdhYbQcbIJRfBU6iLQJt7/n+\nQdov3337SC+3p+f2nPfrv/t47znP0xrfPuee3ms5juMIAIBOJHg9AQBA/CISAAAjIgEAMCISAAAj\nIgEAMCISAACjJK8nAHSHU6dOKT8/X9nZ2ZKkSCSiPn36aNasWZoxY8aXvvZXv/qVcnJy9PDDD/fE\nVIFehUjAN/r27avNmzd3PG5oaNDs2bPVr18/TZ482fi6ffv2KSsrqyemCPQ6RAK+ddddd2nevHn6\n9a9/rezsbL344ou6fPmyzp8/r5ycHK1atUrvvvuuDh8+rFdffVWJiYnKysrq9HkpKSleLwfwBO9J\nwNdycnL0r3/9S2+//bZmzJihTZs26Q9/+INOnTqlP//5z3rqqad03333afHixcrPzzc+DwgqdhLw\nNcuy1LdvXy1atEh79uxRVVWVjh8/rvPnz+vy5cv/7/ldfR4QFEQCvvaPf/xD2dnZeu6559TW1qap\nU6dqwoQJOnPmjDr72LKuPg8ICi43wbeOHTum1atXa86cOdq9e7dKSkr0rW99S5Zl6dChQ2pra5Mk\nJSYmqrW1VZK+9HlAELGTgG9cuXJF06dPlyQlJCQoJSVFzz33nCZMmKD58+erpKREAwYMUL9+/fTg\ngw/q008/lSRNnDhRZWVlamlp+dLnAUFk8VHhAAATLjcBAIyIBADAiEgAAIyIBADAiEgAAIzi5hbY\nxsbPYz5GKJQq2w7eX8cGcd1BXLMUzHX7Yc2ZmeleT+GW+WonkZSU6PUUPBHEdQdxzVIw1x3ENccT\nX0UCANC9iAQAwIhIAACMiAQAwIhIAACMiAQAwIhIAACMiAQAwIhIAACMiAQAwChuPrsJAOJFRbhK\n9fZROXJkydLIUJZKc4u9npYn2EkAwA0qwlWqs4/I0fVvdnbkqM4+oopwlccz8waRAIAb1NtHoxr3\nOyIBADdo30F0ddzviAQA3MCSFdW43xEJALjByFBWVON+RyQA4AalucXKCY3o2DlYspQTGhHYu5u4\nBRYA/ktQg9AZdhIAACMiAQAwIhIAACMiAQAwIhIAACMiAQAwIhIAACMiAQAwIhIAACMiAQAwIhIA\nACMiAQAwIhIAACMiAQAwIhIAACMiAQAwIhIAACMiAQAwIhIAACMiAQAwIhIAACMiAQAwIhIAAKMk\ntw7c0tKipUuXqqGhQQkJCVqxYoXuuecet04HwGcqwlWqt4/KkSNLlkaGslSaW+z1tALHtZ3Ejh07\n1NraqpqaGpWUlGjVqlVunQqAz1SEq1RnH5EjR5LkyFGdfUQV4SqPZxY8rkVi2LBhamtrUyQSUVNT\nk5KSXNu0APCZevtoVONwj2v/5U5NTVVDQ4OmTp0q27ZVWVn5pc8PhVKVlJQY83kzM9NjPkZvFMR1\nB3HNUjDW3b6D6Gw8COuPJ65FYt26dRo/frwWLFigM2fO6Omnn9Zvf/tbpaSkdPp8274c8zkzM9PV\n2Ph5zMfpbYK47iCuWQrOui1ZnYbCktUr19+bw+ba5abbbrtN6enXfzADBgxQa2ur2tra3DodAB8Z\nGcqKahzucS0Ss2fP1j//+U8VFRXp6aef1vz585WamurW6QD4SGlusXJCI2TJknR9B5ETGsHdTR6w\nHMfp/OJfD+uOLWRQtuL/LYjrDuKapWCu2w9r5nITAMCXiAQAwIhIAACMiAQAwIhIAACMiAQAwIhI\nAACMiAQAwIhIAACMiAQAwIhIAACMiAQAwIhIAACMiAQAwIhIA
ACMiAQAwIhIAACMiAQAwIhIAACM\niAQAwIhIAACMiAQAwCjJ6wkAiG8V4SrV20flyJElSyNDWSrNLfZ6Wugh7CQAGFWEq1RnH5EjR5Lk\nyFGdfUQV4SqPZ4aeQiQAGNXbR6Mah/8QCQBG7TuIro7Df4gEACNLVlTj8B8iAcBoZCgrqnH4D5EA\nYFSaW6yc0IiOnYMlSzmhEdzdFCDcAgvgSxGEYGMnAQAwIhIAACMiAQAwIhIAACMiAQAwIhIAACMi\nAQAwIhIAACMiAQAwIhIAACMiAQAwIhIAACMiAQAwIhIAACMiAQAwIhIAACMiAQAwIhIAACMiAQAw\nIhIAACMiAQAwIhIAACMiAQAwSnLz4K+//ro+/PBDtbS06Mknn9S3v/1tN08H+FZFuEr19lE5cmTJ\n0shQlkpzi72eFgLAtZ3Evn37FA6HtXHjRlVXV+vs2bNunQrwtYpwlersI3LkSJIcOaqzj6giXOXx\nzBAEru0kdu/erezsbJWUlKipqUmLFy9261SAr9XbR6MaB7qTa5GwbVunT59WZWWlTp06pblz52rr\n1q2yLKvT54dCqUpKSoz5vJmZ6TEfozcK4rqDsub2HURn40H5GQRlnfHItUhkZGRo+PDhSk5O1vDh\nw5WSkqJLly7p9ttv7/T5tn055nNmZqarsfHzmI/T2wRx3UFasyWr01BYsgLxM/DD77o3R8619yTG\njh2rXbt2yXEcnTt3Tl988YUyMjLcOh3gWyNDWVGNA93JtZ3ExIkTtX//fhUUFMhxHC1fvlyJibFf\nTgKCpjS3mLub4BnLcZzOL3j2sO7YTvphW3orgrjuIK5ZCua6/bBmLjcBAHyJSAAAjIgEAMCISAAA\njIgEAMCISAAAjIgEAMCISAAAjIgEAMCISAAAjIgEAMCISAAAjIgEAMCISAAAjIgEAMCISAAAjIgE\nAMCISAAAjIgEAMCISAAAjIgEAMCISAAAjJK8ngDQW1SEq1RvH5UjR5YsjQxlqTS32OtpAa5iJwF0\nQUW4SnX2ETlyJEmOHNXZR1QRrvJ4ZoC7iATQBfX20ajGAb8gEkAXtO8gujoO+AWRALrAkhXVOOAX\nRALogpGhrKjGAb8gEkAXlOYWKyc0omPnYMlSTmgEdzfB97gFFugigoAgYicBAHFs6dKlOnjwoGfn\nJxIAACMuNwFAHGlqatKiRYtk27aSkpLUr18/SdKpU6f0k5/8RNeuXVNzc7NWrVqlq1ev6gc/+IEs\ny9KQIUNUVlamdevWaevWrWptbVVxcbEmT54c03yIBADEkY0bN2rMmDEqLi7Wrl279MMf/lCSdOzY\nMT3zzDO67777tGbNGu3YsUORSEQTJkzQ9773Pf3ud79Tc3OzPvjgA/385z/XgAEDtGfPnpjnw+Um\nAIgjJ0+e1AMPPCBJysvL07hx4yRJmZmZeuONN7R06VLt3btXLS0tKigo0JUrVzR79mx9/PHHSkhI\n0Isvvqhf/OIXKi0t1dWrV2OeD5EAgDgyfPhw1dbWSpK2bt2qnTt3SpJ++ctfas6cOVq5cqXuvPNO\nSdKHH36ocePG6Te/+Y369Omjv/zlL3r33Xf18ssva+3atXrttdding+XmwAgjnznO9/R0qVLtX37\ndvXp00f333+/JOmRRx7RggULNHDgQGVkZOj8+fP65je/qWXLliklJUX9+/fXgw8+qNOnT6uoqEh9\n+/ZVYWFhzPOxHMeJiw+faWz8POZjZGamd8txepsgrjuIa5aCuW4/rDkzM93rKdyym15u+vvf/94T\n8wAAxKGbXm4qLy+XbduaPn26pk+frszMzJ6YFwAgDtw0EuvXr1dDQ4M2b96s7373u7rzzjv12GOP\nadKkSerTp09PzBEA4JEu3d101113acaMGXr00Ud15MgRrV+/Xo8++qj++Mc/uj0/AICHbrqTeOed\nd7R582Y1NjZqxowZ2rBhg
wYPHqxz587pscceU35+fk/MEwDggZtGYv/+/SotLdXXv/71/zN+xx13\n6Mc//rFrEwMAeO+mkXj11VeN/yzWzwQBAPSMSCSiF154QfX19UpOTtZLL72koUOH3vR1/DEdAMSh\nv9Wf17a/fqqzF5s1+Pb+evhrQzRm5KBbPt62bdt07do1bdq0SQcPHtTKlSu79BfZRAIA4szf6s+r\n+oPajsdnLjR1PL7VUBw4cEB5eXmSpNGjR+vw4cNdeh2f3QQAcWbbXz/tdHy7YbwrmpqalJaW1vE4\nMTFRra2tN30dkQCAOHP2YnPn45c6H++KtLQ0NTf/7+sjkYiSkm5+MYlIAECcGXx7/87HB3Y+3hVj\nxozp+ETZgwcPKjs7u0uvIxIAEGce/tqQTscnGca7Ij8/X8nJySosLNQrr7yi559/vkuv441rAIgz\n7W9Ob//rpzp7qVmDB/bXpBjvbmr/QqJoEQkAiENjRg6KKQrdxdVIXLx4UY8//rjeeOMN3XPPPW6e\nCgFRXhNW3QlbEUdKsKScoSEtLMz1elqAb7n2nkRLS4uWL1+uvn37unUKBEx5TVi1x68HQpIijlR7\n3FZ5TdjbiQE+5lokysrKVFhYqEGDvN8uwR/qTthRjQOInSuXm95//30NHDhQeXl5WrNmTZdeEwql\nKikpMeZz9+avCYxFENYdMXzRbsQJxvrbBWmt7YK45njhSiTee+89WZalvXv36pNPPtGSJUv02muv\nfem32tn25ZjP64fvwr0VQVl3gtV5KBKs7vmO9N4gKL/rG/lhzb05cq5cbnrrrbf05ptvqrq6Wvfe\ne6/Kysr42lPELGdoKKpxAP/foUOHNHPmzC4/n1tg0WssLMzl7iYExqGztfrTfz7SueYLuqP/VzRx\n+Df0wOBRMR2zqqpKW7ZsUb9+/br8GtcjUV1d7fYpECDtQfDDJQjA5NDZWm38++aOx2ebGjsexxKK\nIUOGqKKiQosXL+7ya/hYDgCIM3/6z0edjx/bG9NxJ0+e3KUP9bsRkQCAOHOu+UKn4+ebOh93E5EA\ngDhzR/+vdDo+KK3zcTcRCQCIMxOHf6Pz8WHjengm3N0EAHGn/c3pPx3bq/NNFzQo7SuaOGxczHc3\nSdJXv/pVvf32211+PpEAgDj0wOBR3RKFWHG5CQBgRCQAAEZEAgBgRCQAAEZEAgBgRCQAAEZEAgBg\nRCQAAEZEAgBgRCQAAEZEAgBgRCQAAEZEAgBgRCQAAEZEAgBgRCQAAEZ86RBuSXlNWHUnbEUcKcGS\ncoaGtLAw1+tpAehm7CQQtfKasGqPXw+EJEUcqfa4rfKasLcTA9DtiASiVnfCjmocQO9FJBC19h1E\nV8cB9F5EAlFLsKIbB9B7EQlELWdoKKpxAL0XkUDUFhbmatTdoY6dQ4Iljbqbu5sAP+IWWNwSggAE\nAzsJAIARkQAAGBEJAIARkQAAGBEJAIARkQAAGBEJAIARkQAAGBEJAIARkQAAGBEJAIARkQAAGBEJ\nAIARkQAAGBEJAIARkQAAGBEJAIARkQAAGBEJAIARkQAAGBEJAIARkQAAGBEJAIBRkhsHbWlp0bJl\ny9TQ0KBr165p7ty5mjRpkhunCrTymrDqTtiKOFKCJeUMDWlhYa7X0wLgI67sJLZs2aKMjAxt2LBB\na9eu1YoVK9w4TaCV14RVe/x6ICQp4ki1x22V14S9nRgAX3FlJzFlyhRNnjxZkuQ4jhITE904TaDV\nnbCjGgeAW+FKJPr37y9Jampq0rx58/Tss8/e9DWhUKqSkmKPSWZmeszH6A3adxCdjQflZxCUdf63\nIK47iGuOF65EQpLOnDmjkpISFRUVadq0aTd9vm1fjvmcmZnpamz8PObj9AYJVuehSLAUiJ9BkH7X\nNwriuv2w5t4cOVfek7hw4YLmzJmjRYsWqaCgwI1TBF7O0FBU4wBwK1yJRGVlpT777DOtXr1
aM2fO\n1MyZM3XlyhU3ThVYCwtzNerukBKs648TLGnU3dzdBKB7WY7jGK5u96zu2E76YVt6K4K47iCuWQrm\nuv2wZi43AQB8iUgAAIyIBADAiEgAAIyIBADAiEgAAIyIBADAiEgAAIyIBADAiEgAAIyIBADAiEgA\nAIyIBADAiEgAAIyIBADAiEgAAIyIBADAiEgAAIyIBADAiEgAAIyIBADAiEgAAIySvJ5Ab1deE1bd\nCVsRR0qwpJyhIS0szPV6WgDQLdhJxKC8Jqza49cDIUkRR6o9bqu8JuztxACgmxCJGNSdsKMaB4De\nhkjEoH0H0dVxAOhtiEQMEqzoxgGgtyESMcgZGopqHAB6GyIRg4WFuRp1d6hj55BgSaPu5u4mAP7B\nLbAxIggA/IydBADAiEgAAIyIBADAiEgAAIyIBADAiEgAAIyIBADAiEgAAIyIBADAiEgAAIyIBADA\niEgAAIyIBADAiEgAAIyIBADAiEgAAIyIBADAiEgAAIyIBADAiEgAAIyIBADAiEgAAIyIBADAKMmt\nA0ciEb3wwguqr69XcnKyXnrpJQ0dOtSVc5XXhFV3wlbEkRIsKWdoSAsLc105FwAEiWs7iW3btuna\ntWvatGmTFixYoJUrV7pynvKasGqPXw+EJEUcqfa4rfKasCvnA4AgcS0SBw4cUF5eniRp9OjROnz4\nsCvnqTthRzUOAOg61y43NTU1KS0treNxYmKiWltblZTU+SlDoVQlJSVGfZ72HURn45mZ6VEfr7cK\n0lrbBXHNUjDXHcQ1xwvXIpGWlqbm5uaOx5FIxBgISbLty7d0ngSr81AkWFJj4+e3dMzeJjMzPTBr\nbRfENUvBXLcf1tybI+fa5aYxY8Zo586dkqSDBw8qOzvblfPkDA1FNQ4A6DrXIpGfn6/k5GQVFhbq\nlVde0fPPP+/KeRYW5mrU3SElWNcfJ1jSqLu5uwkAuoPlOI7hqn7P6o7tpB+2pbciiOsO4pqlYK7b\nD2vmchMAwJeIBADAiEgAAIyIBADAiEgAAIyIBADAiEgAAIyIBADAiEgAAIzi5i+uAQDxh50EAMCI\nSAAAjIgEAMCISAAAjIgEAMCISAAAjHwRiUgkouXLl+uJJ57QzJkzdeLECa+n5LqWlhYtWrRIRUVF\nKigo0Pbt272eUo+6ePGiHnroIf373//2eio94vXXX9cTTzyhxx9/XO+8847X0+kRLS0tWrBggQoL\nC1VUVBSY33W88UUktm3bpmvXrmnTpk1asGCBVq5c6fWUXLdlyxZlZGRow4YNWrt2rVasWOH1lHpM\nS0uLli9frr59+3o9lR6xb98+hcNhbdy4UdXV1Tp79qzXU+oRO3bsUGtrq2pqalRSUqJVq1Z5PaVA\n8kUkDhw4oLy8PEnS6NGjdfjwYY9n5L4pU6bomWeekSQ5jqPExESPZ9RzysrKVFhYqEGDBnk9lR6x\ne/duZWdnq6SkRN///vc1YcIEr6fUI4YNG6a2tjZFIhE1NTUpKSnJ6ykFki9+6k1NTUpLS+t4nJiY\nqNbWVl//S9W/f39J19c+b948Pfvssx7PqGe8//77GjhwoPLy8rRmzRqvp9MjbNvW6dOnVVlZqVOn\nTmnu3LnaunWrLMvyemquSk1NVUNDg6ZOnSrbtlVZWen1lALJFzuJtLQ0NTc3dzyORCK+DkS7M2fO\naNasWZo+fbqmTZvm9XR6xHvvvaePPvpIM2fO1CeffKIlS5aosbHR62m5KiMjQ+PHj1dycrKGDx+u\nlJQUXbp0yetpuW7dunUaP368fv/732vz5s1aunSprl696vW0AscXkRgzZox27twpSTp48KCys7M9\nnpH7Lly4oDlz5mjRokUqKCjwejo95q233tKbb76p6upq3XvvvSorK1NmZqbX03LV2LFjtWvXLjmO\no3PnzumLL75QRkaG19Ny3W233ab09HRJ0oABA9Ta2qq
2tjaPZxU8vvjf7fz8fO3Zs0eFhYVyHEcv\nv/yy11NyXWVlpT777DOtXr1aq1evliRVVVUF5s3cIJk4caL279+vgoICOY6j5cuXB+I9qNmzZ2vZ\nsmUqKipSS0uL5s+fr9TUVK+nFTh8CiwAwMgXl5sAAO4gEgAAIyIBADAiEgAAIyIBADAiEgAAIyIB\nADAiEgiM9evX66mnnpLjOPr444/1yCOPqKmpyetpAXGNP6ZDYDiOo1mzZmnKlCmqrq7WT3/6U40d\nO9braQFxjUggUE6ePKlp06bpySef1JIlS7yeDhD3uNyEQDl9+rTS0tJUW1sr/v8IuDkigcBobm7W\nj370I61evVr9+vXThg0bvJ4SEPeIBALjZz/7mR566CHdf//9HbE4efKk19MC4hrvSQAAjNhJAACM\niAQAwIhIAACMiAQAwIhIAACMiAQAwIhIAACMiAQAwOh/ABD+xhLx1ojiAAAAAElFTkSuQmCC\n", "text/plain": [ "

\n", "To this day it is still considered an excellent vision model, although it has been somewhat outperformed by more recent advances." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We want to display what sort of input (such inputs are not unique) maximizes each filter in each layer, giving us a neat visualization of the convnet's modular-hierarchical decomposition of its visual space." ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "scrolled": true }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Model loaded.\n", "_________________________________________________________________\n", "Layer (type) Output Shape Param # \n", "=================================================================\n", "input_3 (InputLayer) (None, None, None, 3) 0 \n", "_________________________________________________________________\n", "block1_conv1 (Conv2D) (None, None, None, 64) 1792 \n", "_________________________________________________________________\n", "block1_conv2 (Conv2D) (None, None, None, 64) 36928 \n", "_________________________________________________________________\n", "block1_pool (MaxPooling2D) (None, None, None, 64) 0 \n", "_________________________________________________________________\n", "block2_conv1 (Conv2D) (None, None, None, 128) 73856 \n", "_________________________________________________________________\n", "block2_conv2 (Conv2D) (None, None, None, 128) 147584 \n", "_________________________________________________________________\n", "block2_pool (MaxPooling2D) (None, None, None, 128) 0 \n", "_________________________________________________________________\n", "block3_conv1 (Conv2D) (None, None, None, 256) 295168 \n", "_________________________________________________________________\n", "block3_conv2 (Conv2D) (None, None, None, 256) 590080 \n", "_________________________________________________________________\n", "block3_conv3 (Conv2D) (None, None, None, 256) 590080 \n", 
"_________________________________________________________________\n", "block3_pool (MaxPooling2D) (None, None, None, 256) 0 \n", "_________________________________________________________________\n", "block4_conv1 (Conv2D) (None, None, None, 512) 1180160 \n", "_________________________________________________________________\n", "block4_conv2 (Conv2D) (None, None, None, 512) 2359808 \n", "_________________________________________________________________\n", "block4_conv3 (Conv2D) (None, None, None, 512) 2359808 \n", "_________________________________________________________________\n", "block4_pool (MaxPooling2D) (None, None, None, 512) 0 \n", "_________________________________________________________________\n", "block5_conv1 (Conv2D) (None, None, None, 512) 2359808 \n", "_________________________________________________________________\n", "block5_conv2 (Conv2D) (None, None, None, 512) 2359808 \n", "_________________________________________________________________\n", "block5_conv3 (Conv2D) (None, None, None, 512) 2359808 \n", "_________________________________________________________________\n", "block5_pool (MaxPooling2D) (None, None, None, 512) 0 \n", "=================================================================\n", "Total params: 14,714,688.0\n", "Trainable params: 14,714,688.0\n", "Non-trainable params: 0.0\n", "_________________________________________________________________\n", "Processing filter 0\n", "Current loss value: 12.0653\n", "Current loss value: 27.6922\n", "Current loss value: 43.5864\n", "Current loss value: 64.7105\n", "Current loss value: 90.5517\n", "Current loss value: 117.57\n", "Current loss value: 135.959\n", "Current loss value: 160.824\n", "Current loss value: 180.185\n", "Current loss value: 203.131\n", "Current loss value: 230.273\n", "Current loss value: 257.409\n", "Current loss value: 283.377\n", "Current loss value: 308.575\n", "Current loss value: 326.87\n", "Current loss value: 354.618\n", "Current loss value: 
373.482\n", "Current loss value: 398.598\n", "Current loss value: 421.336\n", "Current loss value: 440.72\n", "Filter 0 processed in 12s\n", "Processing filter 1\n", "Current loss value: 0.054846\n", "Current loss value: 11.4389\n", "Current loss value: 35.2914\n", "Current loss value: 70.8651\n", "Current loss value: 104.926\n", "Current loss value: 136.879\n", "Current loss value: 169.367\n", "Current loss value: 225.64\n", "Current loss value: 285.739\n", "Current loss value: 361.342\n", "Current loss value: 440.555\n", "Current loss value: 527.626\n", "Current loss value: 608.931\n", "Current loss value: 700.198\n", "Current loss value: 784.636\n", "Current loss value: 869.762\n", "Current loss value: 959.104\n", "Current loss value: 1050.47\n", "Current loss value: 1138.34\n", "Current loss value: 1225.35\n", "Filter 1 processed in 11s\n", "Processing filter 2\n", "Current loss value: 8.7815\n", "Current loss value: 24.0476\n", "Current loss value: 55.2829\n", "Current loss value: 93.2559\n", "Current loss value: 129.784\n", "Current loss value: 159.297\n", "Current loss value: 199.508\n", "Current loss value: 225.587\n", "Current loss value: 259.298\n", "Current loss value: 282.967\n", "Current loss value: 320.44\n", "Current loss value: 348.477\n", "Current loss value: 380.775\n", "Current loss value: 409.022\n", "Current loss value: 441.056\n", "Current loss value: 465.926\n", "Current loss value: 497.068\n", "Current loss value: 525.693\n", "Current loss value: 556.092\n", "Current loss value: 583.95\n", "Filter 2 processed in 12s\n", "Processing filter 3\n", "Current loss value: 12.2802\n", "Current loss value: 46.2265\n", "Current loss value: 121.172\n", "Current loss value: 214.967\n", "Current loss value: 292.272\n", "Current loss value: 361.639\n", "Current loss value: 420.678\n", "Current loss value: 478.762\n", "Current loss value: 537.095\n", "Current loss value: 589.936\n", "Current loss value: 642.039\n", "Current loss value: 689.01\n", "Current 
loss value: 740.328\n", "Current loss value: 787.11\n", "Current loss value: 836.365\n", "Current loss value: 878.922\n", "Current loss value: 925.988\n", "Current loss value: 972.086\n", "Current loss value: 1018.96\n", "Current loss value: 1062.44\n", "Filter 3 processed in 12s\n", "Processing filter 4\n", "Current loss value: 7.50535\n", "Current loss value: 42.0962\n", "Current loss value: 83.0629\n", "Current loss value: 125.395\n", "Current loss value: 168.366\n", "Current loss value: 211.246\n", "Current loss value: 253.029\n", "Current loss value: 297.205\n", "Current loss value: 340.937\n", "Current loss value: 385.14\n", "Current loss value: 434.481\n", "Current loss value: 482.868\n", "Current loss value: 526.604\n", "Current loss value: 566.232\n", "Current loss value: 613.278\n", "Current loss value: 654.529\n", "Current loss value: 698.07\n", "Current loss value: 735.11\n", "Current loss value: 779.76\n", "Current loss value: 823.956\n", "Filter 4 processed in 11s\n", "Processing filter 5\n", "Current loss value: 26.0109\n", "Current loss value: 45.7031\n", "Current loss value: 75.9512\n", "Current loss value: 112.978\n", "Current loss value: 150.645\n", "Current loss value: 196.963\n", "Current loss value: 241.013\n", "Current loss value: 287.311\n", "Current loss value: 332.728\n", "Current loss value: 382.202\n", "Current loss value: 432.15\n", "Current loss value: 477.348\n", "Current loss value: 520.549\n", "Current loss value: 563.786\n", "Current loss value: 603.081\n", "Current loss value: 642.775\n", "Current loss value: 683.908\n", "Current loss value: 721.548\n", "Current loss value: 757.698\n", "Current loss value: 797.728\n", "Filter 5 processed in 11s\n", "Processing filter 6\n", "Current loss value: 0.0670014\n", "Current loss value: 7.16004\n", "Current loss value: 29.9658\n", "Current loss value: 51.4939\n", "Current loss value: 74.0536\n", "Current loss value: 98.7898\n", "Current loss value: 117.584\n", "Current loss value: 
152.618\n", "Current loss value: 188.454\n", "Current loss value: 229.876\n", "Current loss value: 263.566\n", "Current loss value: 303.163\n", "Current loss value: 353.104\n", "Current loss value: 401.39\n", "Current loss value: 438.023\n", "Current loss value: 478.654\n", "Current loss value: 524.682\n", "Current loss value: 561.065\n", "Current loss value: 603.75\n", "Current loss value: 639.831\n", "Filter 6 processed in 12s\n", "Processing filter 7\n", "Current loss value: 4.5464\n", "Current loss value: 12.2474\n", "Current loss value: 27.0427\n", "Current loss value: 56.8314\n", "Current loss value: 106.635\n", "Current loss value: 152.487\n", "Current loss value: 206.929\n", "Current loss value: 260.204\n", "Current loss value: 311.328\n", "Current loss value: 365.716\n", "Current loss value: 416.819\n", "Current loss value: 465.877\n", "Current loss value: 522.064\n", "Current loss value: 581.23\n", "Current loss value: 640.849\n", "Current loss value: 702.734\n", "Current loss value: 773.033\n", "Current loss value: 841.491\n", "Current loss value: 912.669\n", "Current loss value: 982.22\n", "Filter 7 processed in 13s\n", "Processing filter 8\n", "Current loss value: 99.275\n", "Current loss value: 157.454\n", "Current loss value: 218.57\n", "Current loss value: 295.79\n", "Current loss value: 395.654\n", "Current loss value: 483.419\n", "Current loss value: 566.477\n", "Current loss value: 645.18\n", "Current loss value: 719.005\n", "Current loss value: 793.408\n", "Current loss value: 863.778\n", "Current loss value: 929.088\n", "Current loss value: 990.485\n", "Current loss value: 1053.07\n", "Current loss value: 1109.66\n", "Current loss value: 1169.93\n", "Current loss value: 1226.49\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Current loss value: 1281.94\n", "Current loss value: 1337.66\n", "Current loss value: 1394.93\n", "Filter 8 processed in 13s\n", "Processing filter 9\n", "Current loss value: 13.6833\n", "Current loss value: 
55.8288\n", "Current loss value: 86.0705\n", "Current loss value: 122.628\n", "Current loss value: 168.689\n", "Current loss value: 229.909\n", "Current loss value: 279.108\n", "Current loss value: 329.48\n", "Current loss value: 403.989\n", "Current loss value: 462.027\n", "Current loss value: 506.369\n", "Current loss value: 554.33\n", "Current loss value: 590.236\n", "Current loss value: 644.05\n", "Current loss value: 684.668\n", "Current loss value: 736.1\n", "Current loss value: 769.723\n", "Current loss value: 814.091\n", "Current loss value: 857.361\n", "Current loss value: 897.9\n", "Filter 9 processed in 13s\n", "Processing filter 10\n", "Current loss value: 8.86461\n", "Current loss value: 13.4067\n", "Current loss value: 34.6053\n", "Current loss value: 63.0463\n", "Current loss value: 103.868\n", "Current loss value: 148.514\n", "Current loss value: 175.742\n", "Current loss value: 213.8\n", "Current loss value: 264.353\n", "Current loss value: 295.757\n", "Current loss value: 337.277\n", "Current loss value: 373.784\n", "Current loss value: 413.435\n", "Current loss value: 456.402\n", "Current loss value: 486.728\n", "Current loss value: 531.749\n", "Current loss value: 564.796\n", "Current loss value: 603.997\n", "Current loss value: 643.638\n", "Current loss value: 674.153\n", "Filter 10 processed in 13s\n", "Processing filter 11\n", "Current loss value: 5.79534\n", "Current loss value: 15.5635\n", "Current loss value: 80.6594\n", "Current loss value: 174.74\n", "Current loss value: 269.242\n", "Current loss value: 344.669\n", "Current loss value: 427.23\n", "Current loss value: 492.813\n", "Current loss value: 569.635\n", "Current loss value: 643.919\n", "Current loss value: 707.34\n", "Current loss value: 767.506\n", "Current loss value: 831.009\n", "Current loss value: 891.361\n", "Current loss value: 950.528\n", "Current loss value: 1014.21\n", "Current loss value: 1071.28\n", "Current loss value: 1130.2\n", "Current loss value: 1186.08\n", 
"Current loss value: 1239.83\n", "Filter 11 processed in 13s\n", "Processing filter 12\n", "Current loss value: 0.0\n", "Filter 12 processed in 1s\n", "Processing filter 13\n", "Current loss value: 10.5346\n", "Current loss value: 67.9013\n", "Current loss value: 113.977\n", "Current loss value: 147.437\n", "Current loss value: 183.78\n", "Current loss value: 226.534\n", "Current loss value: 260.352\n", "Current loss value: 305.025\n", "Current loss value: 330.475\n", "Current loss value: 364.802\n", "Current loss value: 394.377\n", "Current loss value: 420.785\n", "Current loss value: 448.997\n", "Current loss value: 474.492\n", "Current loss value: 500.347\n", "Current loss value: 525.903\n", "Current loss value: 550.478\n", "Current loss value: 573.823\n", "Current loss value: 598.337\n", "Current loss value: 622.764\n", "Filter 13 processed in 13s\n", "Processing filter 14\n", "Current loss value: 4.99275\n", "Current loss value: 28.5831\n", "Current loss value: 89.9491\n", "Current loss value: 160.362\n", "Current loss value: 218.718\n", "Current loss value: 286.955\n", "Current loss value: 347.315\n", "Current loss value: 409.969\n", "Current loss value: 472.943\n", "Current loss value: 522.258\n", "Current loss value: 569.221\n", "Current loss value: 624.792\n", "Current loss value: 664.809\n", "Current loss value: 721.781\n", "Current loss value: 768.569\n", "Current loss value: 810.742\n", "Current loss value: 861.686\n", "Current loss value: 908.474\n", "Current loss value: 953.865\n", "Current loss value: 995.324\n", "Filter 14 processed in 13s\n", "Processing filter 15\n", "Current loss value: 0.306411\n", "Current loss value: 1.86151\n", "Current loss value: 10.3531\n", "Current loss value: 26.0483\n", "Current loss value: 48.2936\n", "Current loss value: 79.8571\n", "Current loss value: 112.495\n", "Current loss value: 146.335\n", "Current loss value: 180.885\n", "Current loss value: 210.67\n", "Current loss value: 252.621\n", "Current loss value: 
287.222\n", "Current loss value: 324.634\n", "Current loss value: 357.448\n", "Current loss value: 388.73\n", "Current loss value: 421.324\n", "Current loss value: 452.931\n", "Current loss value: 485.757\n", "Current loss value: 515.784\n", "Current loss value: 544.507\n", "Filter 15 processed in 14s\n", "Processing filter 16\n", "Current loss value: 0.105159\n", "Current loss value: 10.4799\n", "Current loss value: 36.2844\n", "Current loss value: 66.4284\n", "Current loss value: 89.847\n", "Current loss value: 121.286\n", "Current loss value: 150.416\n", "Current loss value: 180.815\n", "Current loss value: 210.908\n", "Current loss value: 235.641\n", "Current loss value: 271.262\n", "Current loss value: 304.441\n", "Current loss value: 335.522\n", "Current loss value: 368.063\n", "Current loss value: 399.987\n", "Current loss value: 435.021\n", "Current loss value: 468.951\n", "Current loss value: 507.484\n", "Current loss value: 546.317\n", "Current loss value: 579.109\n", "Filter 16 processed in 13s\n", "Processing filter 17\n", "Current loss value: 0.915992\n", "Current loss value: 10.0036\n", "Current loss value: 28.2838\n", "Current loss value: 65.397\n", "Current loss value: 105.616\n", "Current loss value: 163.585\n", "Current loss value: 219.631\n", "Current loss value: 285.533\n", "Current loss value: 345.776\n", "Current loss value: 416.136\n", "Current loss value: 489.289\n", "Current loss value: 555.463\n", "Current loss value: 626.247\n", "Current loss value: 692.951\n", "Current loss value: 753.056\n", "Current loss value: 812.311\n", "Current loss value: 882.714\n", "Current loss value: 939.363\n", "Current loss value: 999.322\n", "Current loss value: 1058.82\n", "Filter 17 processed in 13s\n", "Processing filter 18\n", "Current loss value: 40.7101\n", "Current loss value: 190.366\n", "Current loss value: 321.413\n", "Current loss value: 425.405\n", "Current loss value: 504.511\n", "Current loss value: 568.216\n", "Current loss value: 626.893\n", 
"Current loss value: 676.23\n", "Current loss value: 732.981\n", "Current loss value: 776.683\n", "Current loss value: 830.66\n", "Current loss value: 873.267\n", "Current loss value: 925.384\n", "Current loss value: 962.806\n", "Current loss value: 1017.9\n", "Current loss value: 1055.96\n", "Current loss value: 1106.42\n", "Current loss value: 1141.43\n", "Current loss value: 1192.24\n", "Current loss value: 1228.47\n", "Filter 18 processed in 13s\n", "Processing filter 19\n", "Current loss value: 0.0\n", "Filter 19 processed in 1s\n", "Processing filter 20\n", "Current loss value: 25.4104\n", "Current loss value: 44.337\n", "Current loss value: 80.7482\n", "Current loss value: 108.741\n", "Current loss value: 136.966\n", "Current loss value: 173.636\n", "Current loss value: 203.518\n", "Current loss value: 235.711\n", "Current loss value: 264.092\n", "Current loss value: 298.316\n", "Current loss value: 328.585\n", "Current loss value: 360.565\n", "Current loss value: 389.621\n", "Current loss value: 421.86\n", "Current loss value: 456.829\n", "Current loss value: 487.004\n", "Current loss value: 514.922\n", "Current loss value: 547.164\n", "Current loss value: 580.996\n", "Current loss value: 610.169\n", "Filter 20 processed in 13s\n", "Processing filter 21\n", "Current loss value: 0.0\n", "Filter 21 processed in 1s\n", "Processing filter 22\n", "Current loss value: 0.0\n", "Filter 22 processed in 1s\n", "Processing filter 23\n", "Current loss value: 15.0417\n", "Current loss value: 23.0334\n", "Current loss value: 41.4121\n", "Current loss value: 80.218\n", "Current loss value: 120.721\n", "Current loss value: 173.759\n", "Current loss value: 227.316\n", "Current loss value: 274.975\n", "Current loss value: 332.868\n", "Current loss value: 383.891\n", "Current loss value: 426.161\n", "Current loss value: 474.181\n", "Current loss value: 524.153\n", "Current loss value: 576.061\n", "Current loss value: 633.459\n", "Current loss value: 679.771\n", "Current loss 
value: 728.024\n", "Current loss value: 769.749\n", "Current loss value: 818.912\n", "Current loss value: 862.521\n", "Filter 23 processed in 13s\n", "Processing filter 24\n", "Current loss value: 8.57865\n", "Current loss value: 39.2493\n", "Current loss value: 87.2243\n", "Current loss value: 126.905\n", "Current loss value: 169.741\n", "Current loss value: 209.231\n", "Current loss value: 248.823\n", "Current loss value: 283.103\n", "Current loss value: 318.899\n", "Current loss value: 361.031\n", "Current loss value: 400.43\n", "Current loss value: 439.277\n", "Current loss value: 471.23\n", "Current loss value: 513.332\n", "Current loss value: 556.007\n", "Current loss value: 594.859\n", "Current loss value: 634.223\n", "Current loss value: 671.762\n", "Current loss value: 706.376\n", "Current loss value: 743.192\n", "Filter 24 processed in 13s\n", "Processing filter 25\n", "Current loss value: 1.19015\n", "Current loss value: 11.9508\n", "Current loss value: 43.3563\n", "Current loss value: 75.4016\n", "Current loss value: 104.965\n", "Current loss value: 133.579\n", "Current loss value: 167.891\n", "Current loss value: 204.411\n", "Current loss value: 245.008\n", "Current loss value: 290.242\n", "Current loss value: 332.281\n", "Current loss value: 371.787\n", "Current loss value: 411.77\n", "Current loss value: 454.238\n", "Current loss value: 488.432\n", "Current loss value: 523.629\n", "Current loss value: 552.874\n", "Current loss value: 588.919\n", "Current loss value: 620.22\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Current loss value: 658.428\n", "Filter 25 processed in 13s\n", "Processing filter 26\n", "Current loss value: 0.0\n", "Filter 26 processed in 1s\n", "Processing filter 27\n", "Current loss value: 2.78984\n", "Current loss value: 5.11771\n", "Current loss value: 17.1885\n", "Current loss value: 33.7786\n", "Current loss value: 63.8487\n", "Current loss value: 95.7518\n", "Current loss value: 140.395\n", "Current loss 
value: 195.654\n", "Current loss value: 253.255\n", "Current loss value: 307.241\n", "Current loss value: 377.244\n", "Current loss value: 453.853\n", "Current loss value: 534.509\n", "Current loss value: 616.213\n", "Current loss value: 700.302\n", "Current loss value: 782.525\n", "Current loss value: 858.481\n", "Current loss value: 940.272\n", "Current loss value: 1010.97\n", "Current loss value: 1089.33\n", "Filter 27 processed in 13s\n", "Processing filter 28\n", "Current loss value: 1.06553\n", "Current loss value: 31.3072\n", "Current loss value: 104.166\n", "Current loss value: 185.806\n", "Current loss value: 252.866\n", "Current loss value: 303.9\n", "Current loss value: 345.979\n", "Current loss value: 395.899\n", "Current loss value: 434.143\n", "Current loss value: 477.2\n", "Current loss value: 513.505\n", "Current loss value: 557.399\n", "Current loss value: 595.74\n", "Current loss value: 638.028\n", "Current loss value: 674.802\n", "Current loss value: 719.473\n", "Current loss value: 754.41\n", "Current loss value: 795.416\n", "Current loss value: 833.389\n", "Current loss value: 871.017\n", "Filter 28 processed in 13s\n", "Processing filter 29\n", "Current loss value: 40.4747\n", "Current loss value: 51.9063\n", "Current loss value: 87.5849\n", "Current loss value: 159.074\n", "Current loss value: 191.236\n", "Current loss value: 245.812\n", "Current loss value: 288.684\n", "Current loss value: 353.134\n", "Current loss value: 406.811\n", "Current loss value: 452.982\n", "Current loss value: 512.859\n", "Current loss value: 558.175\n", "Current loss value: 612.613\n", "Current loss value: 669.062\n", "Current loss value: 704.484\n", "Current loss value: 765.453\n", "Current loss value: 805.903\n", "Current loss value: 850.239\n", "Current loss value: 892.95\n", "Current loss value: 947.283\n", "Filter 29 processed in 13s\n", "Processing filter 30\n", "Current loss value: 27.3185\n", "Current loss value: 58.9087\n", "Current loss value: 
99.5402\n", "Current loss value: 133.971\n", "Current loss value: 173.532\n", "Current loss value: 211.238\n", "Current loss value: 247.795\n", "Current loss value: 279.767\n", "Current loss value: 318.007\n", "Current loss value: 351.07\n", "Current loss value: 388.592\n", "Current loss value: 422.103\n", "Current loss value: 458.926\n", "Current loss value: 496.081\n", "Current loss value: 529.077\n", "Current loss value: 560.094\n", "Current loss value: 594.121\n", "Current loss value: 629.25\n", "Current loss value: 660.048\n", "Current loss value: 694.545\n", "Filter 30 processed in 13s\n", "Processing filter 31\n", "Current loss value: 2.38596\n", "Current loss value: 24.322\n", "Current loss value: 62.4673\n", "Current loss value: 110.789\n", "Current loss value: 162.317\n", "Current loss value: 202.795\n", "Current loss value: 238.012\n", "Current loss value: 285.642\n", "Current loss value: 316.991\n", "Current loss value: 356.874\n", "Current loss value: 387.88\n", "Current loss value: 422.888\n", "Current loss value: 451.918\n", "Current loss value: 486.047\n", "Current loss value: 515.419\n", "Current loss value: 546.688\n", "Current loss value: 580.412\n", "Current loss value: 610.378\n", "Current loss value: 640.181\n", "Current loss value: 674.589\n", "Filter 31 processed in 13s\n", "Processing filter 32\n", "Current loss value: 11.1654\n", "Current loss value: 38.0228\n", "Current loss value: 89.1728\n", "Current loss value: 143.874\n", "Current loss value: 194.627\n", "Current loss value: 248.591\n", "Current loss value: 302.662\n", "Current loss value: 347.286\n", "Current loss value: 400.452\n", "Current loss value: 440.865\n", "Current loss value: 482.382\n", "Current loss value: 518.831\n", "Current loss value: 565.545\n", "Current loss value: 603.963\n", "Current loss value: 644.364\n", "Current loss value: 687.219\n", "Current loss value: 716.613\n", "Current loss value: 763.987\n", "Current loss value: 791.946\n", "Current loss value: 
830.884\n", "Filter 32 processed in 13s\n", "Processing filter 33\n", "Current loss value: 0.050033\n", "Current loss value: 7.0219\n", "Current loss value: 31.8885\n", "Current loss value: 88.755\n", "Current loss value: 150.229\n", "Current loss value: 212.088\n", "Current loss value: 274.496\n", "Current loss value: 377.997\n", "Current loss value: 470.101\n", "Current loss value: 540.403\n", "Current loss value: 619.165\n", "Current loss value: 691.216\n", "Current loss value: 780.842\n", "Current loss value: 860.724\n", "Current loss value: 924.514\n", "Current loss value: 998.056\n", "Current loss value: 1059.02\n", "Current loss value: 1121.34\n", "Current loss value: 1194.47\n", "Current loss value: 1252.61\n", "Filter 33 processed in 13s\n", "Processing filter 34\n", "Current loss value: 5.12268\n", "Current loss value: 12.3418\n", "Current loss value: 40.0112\n", "Current loss value: 102.805\n", "Current loss value: 165.118\n", "Current loss value: 244.26\n", "Current loss value: 313.975\n", "Current loss value: 372.636\n", "Current loss value: 431.534\n", "Current loss value: 495.753\n", "Current loss value: 542.897\n", "Current loss value: 605.086\n", "Current loss value: 651.859\n", "Current loss value: 704.466\n", "Current loss value: 745.024\n", "Current loss value: 798.348\n", "Current loss value: 844.633\n", "Current loss value: 890.282\n", "Current loss value: 947.467\n", "Current loss value: 994.249\n", "Filter 34 processed in 13s\n", "Processing filter 35\n", "Current loss value: 0.0\n", "Filter 35 processed in 1s\n", "Processing filter 36\n", "Current loss value: 10.0076\n", "Current loss value: 36.8783\n", "Current loss value: 107.247\n", "Current loss value: 170.292\n", "Current loss value: 260.065\n", "Current loss value: 330.505\n", "Current loss value: 423.213\n", "Current loss value: 506.103\n", "Current loss value: 583.223\n", "Current loss value: 656.951\n", "Current loss value: 718.651\n", "Current loss value: 790.909\n", "Current loss 
value: 852.078\n", "Current loss value: 919.249\n", "Current loss value: 982.096\n", "Current loss value: 1042.55\n", "Current loss value: 1099.74\n", "Current loss value: 1162.94\n", "Current loss value: 1218.0\n", "Current loss value: 1279.32\n", "Filter 36 processed in 13s\n", "Processing filter 37\n", "Current loss value: 1.84829\n", "Current loss value: 36.2413\n", "Current loss value: 138.224\n", "Current loss value: 229.974\n", "Current loss value: 327.141\n", "Current loss value: 418.541\n", "Current loss value: 508.804\n", "Current loss value: 584.271\n", "Current loss value: 664.776\n", "Current loss value: 741.427\n", "Current loss value: 814.196\n", "Current loss value: 876.22\n", "Current loss value: 939.691\n", "Current loss value: 1008.92\n", "Current loss value: 1075.35\n", "Current loss value: 1139.41\n", "Current loss value: 1204.12\n", "Current loss value: 1272.73\n", "Current loss value: 1337.46\n", "Current loss value: 1401.49\n", "Filter 37 processed in 13s\n", "Processing filter 38\n", "Current loss value: 4.02634\n", "Current loss value: 12.9529\n", "Current loss value: 24.143\n", "Current loss value: 50.0027\n", "Current loss value: 72.4084\n", "Current loss value: 89.101\n", "Current loss value: 125.672\n", "Current loss value: 158.792\n", "Current loss value: 182.908\n", "Current loss value: 209.324\n", "Current loss value: 262.656\n", "Current loss value: 287.857\n", "Current loss value: 302.045\n", "Current loss value: 343.002\n", "Current loss value: 382.482\n", "Current loss value: 410.971\n", "Current loss value: 449.755\n", "Current loss value: 482.883\n", "Current loss value: 516.703\n", "Current loss value: 552.421\n", "Filter 38 processed in 13s\n", "Processing filter 39\n", "Current loss value: 0.0388419\n", "Current loss value: 3.62158\n", "Current loss value: 14.1368\n", "Current loss value: 26.0036\n", "Current loss value: 47.4864\n", "Current loss value: 73.8298\n", "Current loss value: 92.9335\n", "Current loss value: 
109.407\n", "Current loss value: 130.96\n", "Current loss value: 150.321\n", "Current loss value: 169.773\n", "Current loss value: 193.438\n", "Current loss value: 214.502\n", "Current loss value: 233.711\n", "Current loss value: 255.129\n", "Current loss value: 288.689\n", "Current loss value: 320.206\n", "Current loss value: 357.966\n", "Current loss value: 394.437\n", "Current loss value: 428.311\n", "Filter 39 processed in 13s\n", "Processing filter 40\n", "Current loss value: 0.302024\n", "Current loss value: 7.87271\n", "Current loss value: 28.1896\n", "Current loss value: 71.7948\n", "Current loss value: 124.492\n", "Current loss value: 206.124\n", "Current loss value: 290.065\n", "Current loss value: 377.032\n", "Current loss value: 475.168\n", "Current loss value: 585.768\n", "Current loss value: 670.882\n", "Current loss value: 758.981\n", "Current loss value: 840.986\n", "Current loss value: 913.63\n", "Current loss value: 989.311\n", "Current loss value: 1055.24\n", "Current loss value: 1127.63\n", "Current loss value: 1188.29\n", "Current loss value: 1262.69\n", "Current loss value: 1321.7\n", "Filter 40 processed in 13s\n", "Processing filter 41\n", "Current loss value: 0.0\n", "Filter 41 processed in 1s\n", "Processing filter 42\n", "Current loss value: 6.62293\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Current loss value: 63.3157\n", "Current loss value: 146.465\n", "Current loss value: 260.286\n", "Current loss value: 359.543\n", "Current loss value: 447.882\n", "Current loss value: 516.227\n", "Current loss value: 583.136\n", "Current loss value: 653.47\n", "Current loss value: 709.473\n", "Current loss value: 780.778\n", "Current loss value: 835.789\n", "Current loss value: 910.41\n", "Current loss value: 969.05\n", "Current loss value: 1033.13\n", "Current loss value: 1094.93\n", "Current loss value: 1159.8\n", "Current loss value: 1218.78\n", "Current loss value: 1286.46\n", "Current loss value: 1338.95\n", "Filter 42 
processed in 13s\n", "Processing filter 43\n", "Current loss value: 7.23141\n", "Current loss value: 42.1992\n", "Current loss value: 94.9333\n", "Current loss value: 130.342\n", "Current loss value: 166.87\n", "Current loss value: 201.827\n", "Current loss value: 242.448\n", "Current loss value: 274.365\n", "Current loss value: 308.419\n", "Current loss value: 338.739\n", "Current loss value: 372.006\n", "Current loss value: 402.93\n", "Current loss value: 440.272\n", "Current loss value: 472.139\n", "Current loss value: 509.557\n", "Current loss value: 544.465\n", "Current loss value: 577.755\n", "Current loss value: 614.185\n", "Current loss value: 647.091\n", "Current loss value: 682.597\n", "Filter 43 processed in 13s\n", "Processing filter 44\n", "Current loss value: 5.88651\n", "Current loss value: 16.5832\n", "Current loss value: 55.2816\n", "Current loss value: 120.36\n", "Current loss value: 172.561\n", "Current loss value: 226.23\n", "Current loss value: 269.01\n", "Current loss value: 304.307\n", "Current loss value: 335.506\n", "Current loss value: 366.457\n", "Current loss value: 401.807\n", "Current loss value: 436.76\n", "Current loss value: 467.208\n", "Current loss value: 499.616\n", "Current loss value: 530.897\n", "Current loss value: 558.887\n", "Current loss value: 594.039\n", "Current loss value: 625.794\n", "Current loss value: 657.741\n", "Current loss value: 694.535\n", "Filter 44 processed in 13s\n", "Processing filter 45\n", "Current loss value: 0.536129\n", "Current loss value: 52.2245\n", "Current loss value: 127.372\n", "Current loss value: 190.392\n", "Current loss value: 238.508\n", "Current loss value: 283.597\n", "Current loss value: 321.067\n", "Current loss value: 359.027\n", "Current loss value: 393.953\n", "Current loss value: 428.748\n", "Current loss value: 464.848\n", "Current loss value: 498.165\n", "Current loss value: 533.507\n", "Current loss value: 564.868\n", "Current loss value: 604.555\n", "Current loss value: 
637.438\n", "Current loss value: 671.672\n", "Current loss value: 706.199\n", "Current loss value: 736.144\n", "Current loss value: 769.413\n", "Filter 45 processed in 14s\n", "Processing filter 46\n", "Current loss value: 1.24781\n", "Current loss value: 5.92333\n", "Current loss value: 11.6422\n", "Current loss value: 31.8195\n", "Current loss value: 71.3014\n", "Current loss value: 112.46\n", "Current loss value: 159.748\n", "Current loss value: 209.292\n", "Current loss value: 264.406\n", "Current loss value: 322.22\n", "Current loss value: 391.013\n", "Current loss value: 455.162\n", "Current loss value: 510.417\n", "Current loss value: 568.12\n", "Current loss value: 620.393\n", "Current loss value: 673.135\n", "Current loss value: 724.69\n", "Current loss value: 775.858\n", "Current loss value: 826.949\n", "Current loss value: 875.764\n", "Filter 46 processed in 13s\n", "Processing filter 47\n", "Current loss value: 0.978532\n", "Current loss value: 40.8167\n", "Current loss value: 161.127\n", "Current loss value: 261.177\n", "Current loss value: 346.613\n", "Current loss value: 416.432\n", "Current loss value: 480.148\n", "Current loss value: 543.507\n", "Current loss value: 606.154\n", "Current loss value: 659.911\n", "Current loss value: 719.529\n", "Current loss value: 764.348\n", "Current loss value: 811.302\n", "Current loss value: 865.285\n", "Current loss value: 908.945\n", "Current loss value: 961.314\n", "Current loss value: 1004.47\n", "Current loss value: 1054.86\n", "Current loss value: 1101.68\n", "Current loss value: 1152.73\n", "Filter 47 processed in 13s\n", "Processing filter 48\n", "Current loss value: 1.19798\n", "Current loss value: 29.3245\n", "Current loss value: 119.388\n", "Current loss value: 230.655\n", "Current loss value: 337.969\n", "Current loss value: 413.35\n", "Current loss value: 495.053\n", "Current loss value: 569.336\n", "Current loss value: 648.217\n", "Current loss value: 719.389\n", "Current loss value: 777.162\n", 
"Current loss value: 835.732\n", "Current loss value: 896.934\n", "Current loss value: 957.024\n", "Current loss value: 1010.84\n", "Current loss value: 1062.74\n", "Current loss value: 1121.11\n", "Current loss value: 1174.25\n", "Current loss value: 1227.27\n", "Current loss value: 1280.3\n", "Filter 48 processed in 13s\n", "Processing filter 49\n", "Current loss value: 0.683177\n", "Current loss value: 18.6339\n", "Current loss value: 51.0986\n", "Current loss value: 126.744\n", "Current loss value: 202.458\n", "Current loss value: 265.344\n", "Current loss value: 349.105\n", "Current loss value: 424.98\n", "Current loss value: 514.187\n", "Current loss value: 584.495\n", "Current loss value: 668.559\n", "Current loss value: 734.09\n", "Current loss value: 809.16\n", "Current loss value: 874.137\n", "Current loss value: 943.118\n", "Current loss value: 1008.79\n", "Current loss value: 1069.87\n", "Current loss value: 1137.0\n", "Current loss value: 1197.44\n", "Current loss value: 1268.56\n", "Filter 49 processed in 13s\n", "Processing filter 50\n", "Current loss value: 1.95194\n", "Current loss value: 5.81353\n", "Current loss value: 11.0168\n", "Current loss value: 22.0634\n", "Current loss value: 36.9587\n", "Current loss value: 54.4013\n", "Current loss value: 78.0761\n", "Current loss value: 102.064\n", "Current loss value: 127.349\n", "Current loss value: 149.468\n", "Current loss value: 171.52\n", "Current loss value: 196.341\n", "Current loss value: 220.683\n", "Current loss value: 243.036\n", "Current loss value: 268.876\n", "Current loss value: 297.118\n", "Current loss value: 324.715\n", "Current loss value: 356.962\n", "Current loss value: 389.466\n", "Current loss value: 426.627\n", "Filter 50 processed in 13s\n", "Processing filter 51\n", "Current loss value: 0.166452\n", "Current loss value: 12.814\n", "Current loss value: 74.6327\n", "Current loss value: 142.537\n", "Current loss value: 209.932\n", "Current loss value: 269.864\n", "Current loss 
value: 323.544\n", "Current loss value: 380.609\n", "Current loss value: 430.571\n", "Current loss value: 475.728\n", "Current loss value: 522.528\n", "Current loss value: 575.389\n", "Current loss value: 625.007\n", "Current loss value: 671.646\n", "Current loss value: 716.444\n", "Current loss value: 757.897\n", "Current loss value: 797.947\n", "Current loss value: 837.984\n", "Current loss value: 877.388\n", "Current loss value: 916.942\n", "Filter 51 processed in 13s\n", "Processing filter 52\n", "Current loss value: 0.0\n", "Filter 52 processed in 1s\n", "Processing filter 53\n", "Current loss value: 68.6262\n", "Current loss value: 169.689\n", "Current loss value: 307.445\n", "Current loss value: 421.048\n", "Current loss value: 497.783\n", "Current loss value: 572.029\n", "Current loss value: 643.507\n", "Current loss value: 712.488\n", "Current loss value: 769.522\n", "Current loss value: 823.03\n", "Current loss value: 878.532\n", "Current loss value: 928.057\n", "Current loss value: 981.769\n", "Current loss value: 1031.38\n", "Current loss value: 1082.33\n", "Current loss value: 1133.84\n", "Current loss value: 1181.47\n", "Current loss value: 1229.3\n", "Current loss value: 1276.96\n", "Current loss value: 1327.08\n", "Filter 53 processed in 13s\n", "Processing filter 54\n", "Current loss value: 0.0\n", "Filter 54 processed in 1s\n", "Processing filter 55\n", "Current loss value: 0.28425\n", "Current loss value: 25.6046\n", "Current loss value: 74.4269\n", "Current loss value: 120.636\n", "Current loss value: 164.475\n", "Current loss value: 202.422\n", "Current loss value: 238.794\n", "Current loss value: 275.816\n", "Current loss value: 319.477\n", "Current loss value: 365.075\n", "Current loss value: 400.308\n", "Current loss value: 439.167\n", "Current loss value: 471.954\n", "Current loss value: 507.916\n", "Current loss value: 539.843\n", "Current loss value: 573.784\n", "Current loss value: 604.515\n", "Current loss value: 634.109\n", "Current 
loss value: 668.72\n", "Current loss value: 698.356\n", "Filter 55 processed in 13s\n", "Processing filter 56\n", "Current loss value: 0.519568\n", "Current loss value: 7.58915\n", "Current loss value: 46.9433\n", "Current loss value: 91.7451\n", "Current loss value: 134.256\n", "Current loss value: 186.538\n", "Current loss value: 227.608\n", "Current loss value: 276.04\n", "Current loss value: 320.489\n", "Current loss value: 361.081\n", "Current loss value: 396.583\n", "Current loss value: 434.187\n", "Current loss value: 476.271\n", "Current loss value: 518.601\n", "Current loss value: 566.894\n", "Current loss value: 615.604\n", "Current loss value: 661.441\n", "Current loss value: 706.989\n", "Current loss value: 756.103\n", "Current loss value: 806.925\n", "Filter 56 processed in 13s\n", "Processing filter 57\n", "Current loss value: 0.0\n", "Filter 57 processed in 1s\n", "Processing filter 58\n", "Current loss value: 26.7923\n", "Current loss value: 60.1228\n", "Current loss value: 130.917\n", "Current loss value: 207.633\n", "Current loss value: 267.698\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Current loss value: 321.624\n", "Current loss value: 376.41\n", "Current loss value: 425.212\n", "Current loss value: 466.201\n", "Current loss value: 518.426\n", "Current loss value: 550.277\n", "Current loss value: 599.924\n", "Current loss value: 632.1\n", "Current loss value: 673.688\n", "Current loss value: 702.028\n", "Current loss value: 746.16\n", "Current loss value: 777.535\n", "Current loss value: 816.634\n", "Current loss value: 843.62\n", "Current loss value: 887.003\n", "Filter 58 processed in 13s\n", "Processing filter 59\n", "Current loss value: 40.6914\n", "Current loss value: 129.037\n", "Current loss value: 203.483\n", "Current loss value: 276.184\n", "Current loss value: 340.607\n", "Current loss value: 394.943\n", "Current loss value: 446.31\n", "Current loss value: 494.469\n", "Current loss value: 542.534\n", "Current loss 
value: 593.714\n", "Current loss value: 639.824\n", "Current loss value: 691.702\n", "Current loss value: 736.417\n", "Current loss value: 786.338\n", "Current loss value: 834.726\n", "Current loss value: 885.643\n", "Current loss value: 937.868\n", "Current loss value: 989.459\n", "Current loss value: 1043.75\n", "Current loss value: 1092.26\n", "Filter 59 processed in 13s\n", "Processing filter 60\n", "Current loss value: 0.0\n", "Filter 60 processed in 1s\n", "Processing filter 61\n", "Current loss value: 4.71198\n", "Current loss value: 16.3109\n", "Current loss value: 48.6713\n", "Current loss value: 114.221\n", "Current loss value: 189.99\n", "Current loss value: 256.723\n", "Current loss value: 332.176\n", "Current loss value: 399.409\n", "Current loss value: 458.861\n", "Current loss value: 517.301\n", "Current loss value: 568.56\n", "Current loss value: 632.079\n", "Current loss value: 685.89\n", "Current loss value: 736.311\n", "Current loss value: 793.732\n", "Current loss value: 847.956\n", "Current loss value: 899.071\n", "Current loss value: 954.318\n", "Current loss value: 1007.37\n", "Current loss value: 1057.27\n", "Filter 61 processed in 13s\n", "Processing filter 62\n", "Current loss value: 7.4218\n", "Current loss value: 76.4544\n", "Current loss value: 148.615\n", "Current loss value: 198.066\n", "Current loss value: 249.884\n", "Current loss value: 293.997\n", "Current loss value: 336.419\n", "Current loss value: 378.254\n", "Current loss value: 421.034\n", "Current loss value: 456.236\n", "Current loss value: 494.711\n", "Current loss value: 530.736\n", "Current loss value: 566.185\n", "Current loss value: 608.213\n", "Current loss value: 645.371\n", "Current loss value: 681.025\n", "Current loss value: 717.487\n", "Current loss value: 756.211\n", "Current loss value: 788.382\n", "Current loss value: 834.453\n", "Filter 62 processed in 13s\n", "Processing filter 63\n", "Current loss value: 0.0\n", "Filter 63 processed in 1s\n", "Processing 
filter 64\n", "Current loss value: 2.09268\n", "Current loss value: 30.1317\n", "Current loss value: 89.3021\n", "Current loss value: 165.199\n", "Current loss value: 235.539\n", "Current loss value: 327.326\n", "Current loss value: 419.783\n", "Current loss value: 541.654\n", "Current loss value: 647.662\n", "Current loss value: 742.964\n", "Current loss value: 835.716\n", "Current loss value: 922.161\n", "Current loss value: 1016.07\n", "Current loss value: 1088.49\n", "Current loss value: 1177.29\n", "Current loss value: 1256.5\n", "Current loss value: 1346.54\n", "Current loss value: 1417.32\n", "Current loss value: 1501.98\n", "Current loss value: 1579.33\n", "Filter 64 processed in 13s\n", "Processing filter 65\n", "Current loss value: 30.3816\n", "Current loss value: 64.033\n", "Current loss value: 94.4238\n", "Current loss value: 128.957\n", "Current loss value: 165.93\n", "Current loss value: 215.907\n", "Current loss value: 266.101\n", "Current loss value: 307.366\n", "Current loss value: 356.274\n", "Current loss value: 406.16\n", "Current loss value: 447.016\n", "Current loss value: 494.584\n", "Current loss value: 542.609\n", "Current loss value: 583.578\n", "Current loss value: 626.911\n", "Current loss value: 662.337\n", "Current loss value: 701.24\n", "Current loss value: 736.984\n", "Current loss value: 776.27\n", "Current loss value: 812.48\n", "Filter 65 processed in 13s\n", "Processing filter 66\n", "Current loss value: 10.1402\n", "Current loss value: 15.0312\n", "Current loss value: 27.3085\n", "Current loss value: 52.8459\n", "Current loss value: 98.3896\n", "Current loss value: 146.583\n", "Current loss value: 194.689\n", "Current loss value: 241.441\n", "Current loss value: 296.045\n", "Current loss value: 354.747\n", "Current loss value: 401.108\n", "Current loss value: 447.729\n", "Current loss value: 494.685\n", "Current loss value: 549.66\n", "Current loss value: 597.259\n", "Current loss value: 648.243\n", "Current loss value: 
697.194\n", "Current loss value: 747.539\n", "Current loss value: 795.491\n", "Current loss value: 842.628\n", "Filter 66 processed in 13s\n", "Processing filter 67\n", "Current loss value: 0.0\n", "Filter 67 processed in 1s\n", "Processing filter 68\n", "Current loss value: 0.0\n", "Filter 68 processed in 1s\n", "Processing filter 69\n", "Current loss value: 0.0\n", "Filter 69 processed in 1s\n", "Processing filter 70\n", "Current loss value: 2.29804\n", "Current loss value: 15.9096\n", "Current loss value: 36.0112\n", "Current loss value: 57.2724\n", "Current loss value: 82.0419\n", "Current loss value: 114.862\n", "Current loss value: 145.309\n", "Current loss value: 178.774\n", "Current loss value: 218.716\n", "Current loss value: 253.0\n", "Current loss value: 289.749\n", "Current loss value: 324.594\n", "Current loss value: 361.353\n", "Current loss value: 404.929\n", "Current loss value: 456.536\n", "Current loss value: 511.041\n", "Current loss value: 568.222\n", "Current loss value: 619.774\n", "Current loss value: 674.308\n", "Current loss value: 730.738\n", "Filter 70 processed in 13s\n", "Processing filter 71\n", "Current loss value: 0.0\n", "Filter 71 processed in 1s\n", "Processing filter 72\n", "Current loss value: 0.0\n", "Filter 72 processed in 1s\n", "Processing filter 73\n", "Current loss value: 25.8659\n", "Current loss value: 61.9791\n", "Current loss value: 112.244\n", "Current loss value: 153.127\n", "Current loss value: 192.935\n", "Current loss value: 238.576\n", "Current loss value: 287.229\n", "Current loss value: 325.936\n", "Current loss value: 366.642\n", "Current loss value: 403.249\n", "Current loss value: 443.198\n", "Current loss value: 483.538\n", "Current loss value: 516.21\n", "Current loss value: 555.699\n", "Current loss value: 589.129\n", "Current loss value: 626.91\n", "Current loss value: 664.453\n", "Current loss value: 703.954\n", "Current loss value: 738.068\n", "Current loss value: 773.845\n", "Filter 73 processed in 
13s\n", "Processing filter 74\n", "Current loss value: 2.42005\n", "Current loss value: 9.51348\n", "Current loss value: 35.0348\n", "Current loss value: 88.6146\n", "Current loss value: 136.084\n", "Current loss value: 182.138\n", "Current loss value: 229.851\n", "Current loss value: 280.262\n", "Current loss value: 328.783\n", "Current loss value: 370.572\n", "Current loss value: 412.125\n", "Current loss value: 452.91\n", "Current loss value: 498.616\n", "Current loss value: 539.264\n", "Current loss value: 579.2\n", "Current loss value: 621.885\n", "Current loss value: 660.005\n", "Current loss value: 700.874\n", "Current loss value: 741.982\n", "Current loss value: 779.012\n", "Filter 74 processed in 13s\n", "Processing filter 75\n", "Current loss value: 2.39678\n", "Current loss value: 18.9076\n", "Current loss value: 48.9022\n", "Current loss value: 86.842\n", "Current loss value: 126.977\n", "Current loss value: 164.264\n", "Current loss value: 193.246\n", "Current loss value: 229.583\n", "Current loss value: 263.795\n", "Current loss value: 295.897\n", "Current loss value: 327.674\n", "Current loss value: 354.7\n", "Current loss value: 383.899\n", "Current loss value: 415.119\n", "Current loss value: 443.739\n", "Current loss value: 471.218\n", "Current loss value: 497.553\n", "Current loss value: 526.307\n", "Current loss value: 552.673\n", "Current loss value: 583.359\n", "Filter 75 processed in 13s\n", "Processing filter 76\n", "Current loss value: 13.0109\n", "Current loss value: 58.1187\n", "Current loss value: 122.723\n", "Current loss value: 202.936\n", "Current loss value: 284.198\n", "Current loss value: 364.666\n", "Current loss value: 414.156\n", "Current loss value: 471.046\n", "Current loss value: 511.832\n", "Current loss value: 585.499\n", "Current loss value: 633.375\n", "Current loss value: 696.281\n", "Current loss value: 762.01\n", "Current loss value: 822.43\n", "Current loss value: 881.275\n", "Current loss value: 937.819\n", "Current 
loss value: 1000.64\n", "Current loss value: 1054.02\n", "Current loss value: 1113.19\n", "Current loss value: 1161.34\n", "Filter 76 processed in 13s\n", "Processing filter 77\n", "Current loss value: 0.0\n", "Filter 77 processed in 1s\n", "Processing filter 78\n", "Current loss value: 0.0\n", "Filter 78 processed in 1s\n", "Processing filter 79\n", "Current loss value: 2.12546\n", "Current loss value: 24.7037\n", "Current loss value: 83.4598\n", "Current loss value: 174.001\n", "Current loss value: 245.752\n", "Current loss value: 317.152\n", "Current loss value: 385.066\n", "Current loss value: 440.45\n", "Current loss value: 506.817\n", "Current loss value: 563.418\n", "Current loss value: 624.002\n", "Current loss value: 677.313\n", "Current loss value: 727.868\n", "Current loss value: 778.938\n", "Current loss value: 837.491\n", "Current loss value: 886.758\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Current loss value: 934.198\n", "Current loss value: 991.91\n", "Current loss value: 1036.19\n", "Current loss value: 1089.14\n", "Filter 79 processed in 13s\n", "Processing filter 80\n", "Current loss value: 3.63451\n", "Current loss value: 47.0125\n", "Current loss value: 90.8534\n", "Current loss value: 133.044\n", "Current loss value: 165.922\n", "Current loss value: 215.056\n", "Current loss value: 253.798\n", "Current loss value: 296.914\n", "Current loss value: 332.043\n", "Current loss value: 370.915\n", "Current loss value: 401.667\n", "Current loss value: 442.595\n", "Current loss value: 480.801\n", "Current loss value: 518.218\n", "Current loss value: 556.065\n", "Current loss value: 591.453\n", "Current loss value: 626.984\n", "Current loss value: 661.574\n", "Current loss value: 692.79\n", "Current loss value: 730.943\n", "Filter 80 processed in 13s\n", "Processing filter 81\n", "Current loss value: 2.08758\n", "Current loss value: 16.7337\n", "Current loss value: 45.1779\n", "Current loss value: 80.5223\n", "Current loss value: 
127.737\n", "Current loss value: 172.732\n", "Current loss value: 211.114\n", "Current loss value: 248.709\n", "Current loss value: 287.747\n", "Current loss value: 327.232\n", "Current loss value: 373.602\n", "Current loss value: 413.74\n", "Current loss value: 454.809\n", "Current loss value: 499.868\n", "Current loss value: 534.588\n", "Current loss value: 579.301\n", "Current loss value: 618.316\n", "Current loss value: 661.576\n", "Current loss value: 700.36\n", "Current loss value: 741.439\n", "Filter 81 processed in 13s\n", "Processing filter 82\n", "Current loss value: 8.23286\n", "Current loss value: 34.7176\n", "Current loss value: 121.317\n", "Current loss value: 218.383\n", "Current loss value: 270.496\n", "Current loss value: 358.728\n", "Current loss value: 447.47\n", "Current loss value: 509.75\n", "Current loss value: 579.416\n", "Current loss value: 654.253\n", "Current loss value: 722.08\n", "Current loss value: 798.099\n", "Current loss value: 862.94\n", "Current loss value: 922.595\n", "Current loss value: 986.182\n", "Current loss value: 1043.05\n", "Current loss value: 1105.53\n", "Current loss value: 1174.13\n", "Current loss value: 1229.21\n", "Current loss value: 1287.74\n", "Filter 82 processed in 13s\n", "Processing filter 83\n", "Current loss value: 0.0\n", "Filter 83 processed in 1s\n", "Processing filter 84\n", "Current loss value: 12.6567\n", "Current loss value: 26.0367\n", "Current loss value: 52.8407\n", "Current loss value: 73.5631\n", "Current loss value: 119.921\n", "Current loss value: 179.602\n", "Current loss value: 211.916\n", "Current loss value: 276.094\n", "Current loss value: 281.709\n", "Current loss value: 359.436\n", "Current loss value: 382.722\n", "Current loss value: 439.246\n", "Current loss value: 477.857\n", "Current loss value: 522.688\n", "Current loss value: 539.339\n", "Current loss value: 602.637\n", "Current loss value: 635.7\n", "Current loss value: 674.589\n", "Current loss value: 724.183\n", "Current 
loss value: 779.455\n", "Filter 84 processed in 14s\n", "Processing filter 85\n", "Current loss value: 0.0\n", "Filter 85 processed in 1s\n", "Processing filter 86\n", "Current loss value: 6.26649\n", "Current loss value: 18.4397\n", "Current loss value: 68.9961\n", "Current loss value: 141.946\n", "Current loss value: 246.568\n", "Current loss value: 352.121\n", "Current loss value: 450.815\n", "Current loss value: 544.003\n", "Current loss value: 620.521\n", "Current loss value: 704.949\n", "Current loss value: 777.786\n", "Current loss value: 851.445\n", "Current loss value: 920.907\n", "Current loss value: 985.406\n", "Current loss value: 1054.62\n", "Current loss value: 1117.14\n", "Current loss value: 1181.55\n", "Current loss value: 1247.16\n", "Current loss value: 1312.64\n", "Current loss value: 1378.75\n", "Filter 86 processed in 13s\n", "Processing filter 87\n", "Current loss value: 37.1879\n", "Current loss value: 71.4348\n", "Current loss value: 125.352\n", "Current loss value: 181.436\n", "Current loss value: 242.02\n", "Current loss value: 300.458\n", "Current loss value: 360.19\n", "Current loss value: 412.441\n", "Current loss value: 464.375\n", "Current loss value: 522.061\n", "Current loss value: 573.067\n", "Current loss value: 626.774\n", "Current loss value: 676.771\n", "Current loss value: 732.29\n", "Current loss value: 769.447\n", "Current loss value: 822.825\n", "Current loss value: 862.998\n", "Current loss value: 920.412\n", "Current loss value: 955.243\n", "Current loss value: 1013.61\n", "Filter 87 processed in 13s\n", "Processing filter 88\n", "Current loss value: 0.267068\n", "Current loss value: 9.73664\n", "Current loss value: 53.7156\n", "Current loss value: 125.168\n", "Current loss value: 199.689\n", "Current loss value: 265.371\n", "Current loss value: 344.681\n", "Current loss value: 410.944\n", "Current loss value: 493.368\n", "Current loss value: 564.534\n", "Current loss value: 631.642\n", "Current loss value: 710.818\n", 
"Current loss value: 777.121\n", "Current loss value: 849.809\n", "Current loss value: 918.434\n", "Current loss value: 977.296\n", "Current loss value: 1049.52\n", "Current loss value: 1101.71\n", "Current loss value: 1181.28\n", "Current loss value: 1240.88\n", "Filter 88 processed in 14s\n", "Processing filter 89\n", "Current loss value: 0.911907\n", "Current loss value: 0.247465\n", "Current loss value: 3.70111\n", "Current loss value: 10.8941\n", "Current loss value: 21.6112\n", "Current loss value: 35.3046\n", "Current loss value: 46.8707\n", "Current loss value: 58.3421\n", "Current loss value: 80.9748\n", "Current loss value: 98.8327\n", "Current loss value: 127.458\n", "Current loss value: 159.682\n", "Current loss value: 187.594\n", "Current loss value: 222.13\n", "Current loss value: 251.086\n", "Current loss value: 280.261\n", "Current loss value: 310.287\n", "Current loss value: 342.911\n", "Current loss value: 374.67\n", "Current loss value: 412.569\n", "Filter 89 processed in 13s\n", "Processing filter 90\n", "Current loss value: 0.0\n", "Filter 90 processed in 1s\n", "Processing filter 91\n", "Current loss value: 1.21812\n", "Current loss value: 8.37927\n", "Current loss value: 27.6941\n", "Current loss value: 56.5473\n", "Current loss value: 92.5614\n", "Current loss value: 134.297\n", "Current loss value: 179.679\n", "Current loss value: 212.697\n", "Current loss value: 257.88\n", "Current loss value: 307.682\n", "Current loss value: 357.996\n", "Current loss value: 409.186\n", "Current loss value: 457.913\n", "Current loss value: 510.098\n", "Current loss value: 558.233\n", "Current loss value: 616.311\n", "Current loss value: 662.196\n", "Current loss value: 715.275\n", "Current loss value: 761.218\n", "Current loss value: 805.676\n", "Filter 91 processed in 13s\n", "Processing filter 92\n", "Current loss value: 0.51381\n", "Current loss value: 11.8307\n", "Current loss value: 35.5199\n", "Current loss value: 63.6662\n", "Current loss value: 
97.263\n", "Current loss value: 133.29\n", "Current loss value: 173.144\n", "Current loss value: 222.542\n", "Current loss value: 270.453\n", "Current loss value: 318.703\n", "Current loss value: 357.135\n", "Current loss value: 396.428\n", "Current loss value: 435.295\n", "Current loss value: 474.985\n", "Current loss value: 511.302\n", "Current loss value: 548.097\n", "Current loss value: 585.001\n", "Current loss value: 622.493\n", "Current loss value: 658.748\n", "Current loss value: 692.943\n", "Filter 92 processed in 13s\n", "Processing filter 93\n", "Current loss value: 5.76657\n", "Current loss value: 22.1564\n", "Current loss value: 70.7565\n", "Current loss value: 130.536\n", "Current loss value: 209.584\n", "Current loss value: 277.752\n", "Current loss value: 346.922\n", "Current loss value: 402.452\n", "Current loss value: 474.64\n", "Current loss value: 530.268\n", "Current loss value: 587.905\n", "Current loss value: 633.298\n", "Current loss value: 690.739\n", "Current loss value: 745.368\n", "Current loss value: 800.854\n", "Current loss value: 850.994\n", "Current loss value: 906.425\n", "Current loss value: 957.729\n", "Current loss value: 1010.02\n", "Current loss value: 1063.66\n", "Filter 93 processed in 13s\n", "Processing filter 94\n", "Current loss value: 0.0\n", "Filter 94 processed in 1s\n", "Processing filter 95\n", "Current loss value: 56.6828\n", "Current loss value: 101.112\n", "Current loss value: 142.397\n", "Current loss value: 182.752\n", "Current loss value: 223.908\n", "Current loss value: 269.788\n", "Current loss value: 309.178\n", "Current loss value: 347.875\n", "Current loss value: 389.626\n", "Current loss value: 423.308\n", "Current loss value: 455.091\n", "Current loss value: 486.715\n", "Current loss value: 518.181\n", "Current loss value: 549.633\n", "Current loss value: 579.544\n", "Current loss value: 613.356\n", "Current loss value: 640.476\n", "Current loss value: 674.382\n", "Current loss value: 700.546\n", 
"Current loss value: 731.604\n", "Filter 95 processed in 13s\n", "Processing filter 96\n", "Current loss value: 0.0\n", "Filter 96 processed in 1s\n", "Processing filter 97\n", "Current loss value: 2.97165\n", "Current loss value: 44.034\n", "Current loss value: 98.9573\n", "Current loss value: 156.005\n", "Current loss value: 205.666\n", "Current loss value: 255.865\n", "Current loss value: 299.793\n", "Current loss value: 339.283\n", "Current loss value: 376.112\n", "Current loss value: 423.802\n", "Current loss value: 460.891\n", "Current loss value: 502.676\n", "Current loss value: 544.658\n", "Current loss value: 585.46\n", "Current loss value: 623.47\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Current loss value: 661.258\n", "Current loss value: 703.398\n", "Current loss value: 739.366\n", "Current loss value: 778.024\n", "Current loss value: 818.414\n", "Filter 97 processed in 13s\n", "Processing filter 98\n", "Current loss value: 0.191661\n", "Current loss value: 0.0698215\n", "Current loss value: 17.2666\n", "Current loss value: 53.1794\n", "Current loss value: 89.4344\n", "Current loss value: 120.211\n", "Current loss value: 161.623\n", "Current loss value: 202.728\n", "Current loss value: 234.619\n", "Current loss value: 273.113\n", "Current loss value: 308.839\n", "Current loss value: 341.941\n", "Current loss value: 377.561\n", "Current loss value: 417.559\n", "Current loss value: 460.843\n", "Current loss value: 516.541\n", "Current loss value: 574.423\n", "Current loss value: 631.959\n", "Current loss value: 682.094\n", "Current loss value: 738.046\n", "Filter 98 processed in 13s\n", "Processing filter 99\n", "Current loss value: 0.0\n", "Filter 99 processed in 1s\n", "Processing filter 100\n", "Current loss value: 0.0\n", "Filter 100 processed in 1s\n", "Processing filter 101\n", "Current loss value: 1.13327\n", "Current loss value: 8.79333\n", "Current loss value: 25.8912\n", "Current loss value: 58.0943\n", "Current loss value: 
86.9341\n", "Current loss value: 122.686\n", "Current loss value: 165.701\n", "Current loss value: 200.414\n", "Current loss value: 240.199\n", "Current loss value: 276.359\n", "Current loss value: 315.827\n", "Current loss value: 357.861\n", "Current loss value: 406.171\n", "Current loss value: 439.814\n", "Current loss value: 475.58\n", "Current loss value: 508.603\n", "Current loss value: 545.818\n", "Current loss value: 577.532\n", "Current loss value: 609.084\n", "Current loss value: 638.744\n", "Filter 101 processed in 13s\n", "Processing filter 102\n", "Current loss value: 0.0\n", "Filter 102 processed in 1s\n", "Processing filter 103\n", "Current loss value: 1.84914\n", "Current loss value: 10.5446\n", "Current loss value: 42.205\n", "Current loss value: 118.14\n", "Current loss value: 216.426\n", "Current loss value: 309.372\n", "Current loss value: 390.826\n", "Current loss value: 472.451\n", "Current loss value: 543.897\n", "Current loss value: 622.771\n", "Current loss value: 678.864\n", "Current loss value: 745.402\n", "Current loss value: 806.098\n", "Current loss value: 871.275\n", "Current loss value: 926.601\n", "Current loss value: 987.916\n", "Current loss value: 1052.44\n", "Current loss value: 1115.28\n", "Current loss value: 1177.63\n", "Current loss value: 1242.3\n", "Filter 103 processed in 13s\n", "Processing filter 104\n", "Current loss value: 0.0\n", "Filter 104 processed in 1s\n", "Processing filter 105\n", "Current loss value: 0.0\n", "Filter 105 processed in 1s\n", "Processing filter 106\n", "Current loss value: 0.0\n", "Filter 106 processed in 1s\n", "Processing filter 107\n", "Current loss value: 0.463312\n", "Current loss value: 4.46086\n", "Current loss value: 29.3011\n", "Current loss value: 72.465\n", "Current loss value: 112.955\n", "Current loss value: 161.388\n", "Current loss value: 212.472\n", "Current loss value: 264.184\n", "Current loss value: 321.998\n", "Current loss value: 384.052\n", "Current loss value: 446.68\n", 
"Current loss value: 507.796\n", "Current loss value: 571.402\n", "Current loss value: 632.94\n", "Current loss value: 692.786\n", "Current loss value: 742.667\n", "Current loss value: 813.958\n", "Current loss value: 856.574\n", "Current loss value: 916.673\n", "Current loss value: 960.775\n", "Filter 107 processed in 13s\n", "Processing filter 108\n", "Current loss value: 0.0\n", "Filter 108 processed in 1s\n", "Processing filter 109\n", "Current loss value: 2.56363\n", "Current loss value: 16.4263\n", "Current loss value: 41.1221\n", "Current loss value: 80.2379\n", "Current loss value: 120.818\n", "Current loss value: 157.899\n", "Current loss value: 197.646\n", "Current loss value: 232.041\n", "Current loss value: 268.186\n", "Current loss value: 298.982\n", "Current loss value: 329.188\n", "Current loss value: 362.572\n", "Current loss value: 393.625\n", "Current loss value: 426.13\n", "Current loss value: 456.509\n", "Current loss value: 489.98\n", "Current loss value: 519.672\n", "Current loss value: 551.151\n", "Current loss value: 582.739\n", "Current loss value: 612.239\n", "Filter 109 processed in 13s\n", "Processing filter 110\n", "Current loss value: 9.97142\n", "Current loss value: 17.1819\n", "Current loss value: 49.5186\n", "Current loss value: 100.429\n", "Current loss value: 154.169\n", "Current loss value: 208.063\n", "Current loss value: 268.698\n", "Current loss value: 329.627\n", "Current loss value: 394.667\n", "Current loss value: 440.028\n", "Current loss value: 495.897\n", "Current loss value: 550.724\n", "Current loss value: 602.843\n", "Current loss value: 650.316\n", "Current loss value: 707.434\n", "Current loss value: 758.489\n", "Current loss value: 818.464\n", "Current loss value: 872.407\n", "Current loss value: 933.511\n", "Current loss value: 983.23\n", "Filter 110 processed in 13s\n", "Processing filter 111\n", "Current loss value: 0.0\n", "Filter 111 processed in 1s\n", "Processing filter 112\n", "Current loss value: 
10.2015\n", "Current loss value: 19.5463\n", "Current loss value: 45.437\n", "Current loss value: 78.5875\n", "Current loss value: 113.886\n", "Current loss value: 150.662\n", "Current loss value: 183.583\n", "Current loss value: 210.162\n", "Current loss value: 239.399\n", "Current loss value: 267.13\n", "Current loss value: 295.685\n", "Current loss value: 321.539\n", "Current loss value: 350.339\n", "Current loss value: 374.996\n", "Current loss value: 399.442\n", "Current loss value: 426.875\n", "Current loss value: 452.33\n", "Current loss value: 480.193\n", "Current loss value: 506.016\n", "Current loss value: 530.99\n", "Filter 112 processed in 13s\n", "Processing filter 113\n", "Current loss value: 11.1774\n", "Current loss value: 31.8334\n", "Current loss value: 68.8965\n", "Current loss value: 117.679\n", "Current loss value: 151.605\n", "Current loss value: 198.228\n", "Current loss value: 201.734\n", "Current loss value: 268.464\n", "Current loss value: 271.518\n", "Current loss value: 315.063\n", "Current loss value: 342.634\n", "Current loss value: 385.548\n", "Current loss value: 401.888\n", "Current loss value: 448.793\n", "Current loss value: 475.882\n", "Current loss value: 509.24\n", "Current loss value: 523.636\n", "Current loss value: 553.638\n", "Current loss value: 592.056\n", "Current loss value: 621.07\n", "Filter 113 processed in 13s\n", "Processing filter 114\n", "Current loss value: 2.35068\n", "Current loss value: 14.0901\n", "Current loss value: 46.5804\n", "Current loss value: 94.6857\n", "Current loss value: 141.783\n", "Current loss value: 187.367\n", "Current loss value: 244.489\n", "Current loss value: 285.681\n", "Current loss value: 335.38\n", "Current loss value: 389.279\n", "Current loss value: 436.154\n", "Current loss value: 475.943\n", "Current loss value: 518.84\n", "Current loss value: 554.204\n", "Current loss value: 602.997\n", "Current loss value: 643.574\n", "Current loss value: 692.764\n", "Current loss value: 
737.054\n", "Current loss value: 785.196\n", "Current loss value: 829.419\n", "Filter 114 processed in 13s\n", "Processing filter 115\n", "Current loss value: 0.0\n", "Filter 115 processed in 1s\n", "Processing filter 116\n", "Current loss value: 4.96894\n", "Current loss value: 38.9332\n", "Current loss value: 91.9403\n", "Current loss value: 135.953\n", "Current loss value: 170.405\n", "Current loss value: 207.251\n", "Current loss value: 243.166\n", "Current loss value: 271.135\n", "Current loss value: 301.601\n", "Current loss value: 328.644\n", "Current loss value: 361.924\n", "Current loss value: 390.274\n", "Current loss value: 415.625\n", "Current loss value: 444.927\n", "Current loss value: 468.694\n", "Current loss value: 495.088\n", "Current loss value: 521.452\n", "Current loss value: 546.938\n", "Current loss value: 574.38\n", "Current loss value: 598.704\n", "Filter 116 processed in 13s\n", "Processing filter 117\n", "Current loss value: 0.0\n", "Filter 117 processed in 1s\n", "Processing filter 118\n", "Current loss value: 0.113536\n", "Current loss value: 3.60542\n", "Current loss value: 17.7395\n", "Current loss value: 46.8482\n", "Current loss value: 74.6219\n", "Current loss value: 116.493\n", "Current loss value: 162.253\n", "Current loss value: 207.904\n", "Current loss value: 263.104\n", "Current loss value: 317.543\n", "Current loss value: 362.426\n", "Current loss value: 413.866\n", "Current loss value: 452.357\n", "Current loss value: 499.929\n", "Current loss value: 544.348\n", "Current loss value: 592.478\n", "Current loss value: 640.486\n", "Current loss value: 688.708\n", "Current loss value: 732.575\n", "Current loss value: 776.836\n", "Filter 118 processed in 13s\n", "Processing filter 119\n", "Current loss value: 2.77576\n", "Current loss value: 20.3402\n", "Current loss value: 60.9281\n", "Current loss value: 116.329\n", "Current loss value: 188.571\n", "Current loss value: 262.338\n", "Current loss value: 340.893\n", "Current loss 
value: 423.045\n", "Current loss value: 504.75\n", "Current loss value: 579.334\n", "Current loss value: 655.484\n", "Current loss value: 732.242\n", "Current loss value: 809.88\n", "Current loss value: 880.985\n", "Current loss value: 949.129\n", "Current loss value: 1016.91\n", "Current loss value: 1077.22\n", "Current loss value: 1142.81\n", "Current loss value: 1209.18\n", "Current loss value: 1274.24\n", "Filter 119 processed in 13s\n", "Processing filter 120\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Current loss value: 0.0902434\n", "Current loss value: 32.3288\n", "Current loss value: 85.7966\n", "Current loss value: 143.109\n", "Current loss value: 197.608\n", "Current loss value: 239.036\n", "Current loss value: 302.904\n", "Current loss value: 356.07\n", "Current loss value: 405.257\n", "Current loss value: 460.165\n", "Current loss value: 509.431\n", "Current loss value: 565.96\n", "Current loss value: 610.721\n", "Current loss value: 669.293\n", "Current loss value: 722.635\n", "Current loss value: 776.507\n", "Current loss value: 823.757\n", "Current loss value: 878.326\n", "Current loss value: 929.847\n", "Current loss value: 980.291\n", "Filter 120 processed in 14s\n", "Processing filter 121\n", "Current loss value: 3.72151\n", "Current loss value: 17.7027\n", "Current loss value: 44.1206\n", "Current loss value: 76.7585\n", "Current loss value: 112.071\n", "Current loss value: 146.527\n", "Current loss value: 184.728\n", "Current loss value: 226.23\n", "Current loss value: 265.205\n", "Current loss value: 298.061\n", "Current loss value: 331.622\n", "Current loss value: 365.5\n", "Current loss value: 397.776\n", "Current loss value: 435.164\n", "Current loss value: 468.928\n", "Current loss value: 499.171\n", "Current loss value: 528.66\n", "Current loss value: 555.133\n", "Current loss value: 583.843\n", "Current loss value: 616.599\n", "Filter 121 processed in 13s\n", "Processing filter 122\n", "Current loss value: 
0.41298\n", "Current loss value: 0.887183\n", "Current loss value: 19.6641\n", "Current loss value: 55.2359\n", "Current loss value: 86.8922\n", "Current loss value: 125.769\n", "Current loss value: 164.194\n", "Current loss value: 197.453\n", "Current loss value: 239.939\n", "Current loss value: 282.15\n", "Current loss value: 325.312\n", "Current loss value: 369.47\n", "Current loss value: 408.237\n", "Current loss value: 447.438\n", "Current loss value: 481.299\n", "Current loss value: 523.029\n", "Current loss value: 560.336\n", "Current loss value: 599.475\n", "Current loss value: 635.39\n", "Current loss value: 673.99\n", "Filter 122 processed in 14s\n", "Processing filter 123\n", "Current loss value: 0.123309\n", "Current loss value: 0.694302\n", "Current loss value: 4.66033\n", "Current loss value: 14.9112\n", "Current loss value: 28.0061\n", "Current loss value: 50.1406\n", "Current loss value: 84.2142\n", "Current loss value: 123.735\n", "Current loss value: 166.519\n", "Current loss value: 201.414\n", "Current loss value: 241.437\n", "Current loss value: 278.092\n", "Current loss value: 316.471\n", "Current loss value: 345.762\n", "Current loss value: 379.566\n", "Current loss value: 417.066\n", "Current loss value: 446.275\n", "Current loss value: 479.067\n", "Current loss value: 507.277\n", "Current loss value: 538.157\n", "Filter 123 processed in 14s\n", "Processing filter 124\n", "Current loss value: 0.0\n", "Filter 124 processed in 1s\n", "Processing filter 125\n", "Current loss value: 0.945549\n", "Current loss value: 7.60979\n", "Current loss value: 27.2519\n", "Current loss value: 59.1133\n", "Current loss value: 97.6077\n", "Current loss value: 132.319\n", "Current loss value: 171.876\n", "Current loss value: 209.227\n", "Current loss value: 247.022\n", "Current loss value: 290.769\n", "Current loss value: 341.587\n", "Current loss value: 409.267\n", "Current loss value: 455.046\n", "Current loss value: 510.065\n", "Current loss value: 
565.935\n", "Current loss value: 618.244\n", "Current loss value: 683.609\n", "Current loss value: 738.932\n", "Current loss value: 790.837\n", "Current loss value: 845.806\n", "Filter 125 processed in 13s\n", "Processing filter 126\n", "Current loss value: 26.2251\n", "Current loss value: 64.1732\n", "Current loss value: 122.906\n", "Current loss value: 207.514\n", "Current loss value: 292.914\n", "Current loss value: 364.968\n", "Current loss value: 434.101\n", "Current loss value: 505.259\n", "Current loss value: 577.6\n", "Current loss value: 651.884\n", "Current loss value: 721.185\n", "Current loss value: 783.484\n", "Current loss value: 853.044\n", "Current loss value: 920.444\n", "Current loss value: 984.678\n", "Current loss value: 1048.82\n", "Current loss value: 1114.72\n", "Current loss value: 1178.85\n", "Current loss value: 1246.12\n", "Current loss value: 1309.39\n", "Filter 126 processed in 13s\n", "Processing filter 127\n", "Current loss value: 25.6193\n", "Current loss value: 59.3624\n", "Current loss value: 104.323\n", "Current loss value: 151.083\n", "Current loss value: 205.038\n", "Current loss value: 254.423\n", "Current loss value: 294.191\n", "Current loss value: 333.585\n", "Current loss value: 370.44\n", "Current loss value: 408.176\n", "Current loss value: 441.16\n", "Current loss value: 475.456\n", "Current loss value: 507.069\n", "Current loss value: 540.435\n", "Current loss value: 571.523\n", "Current loss value: 602.711\n", "Current loss value: 630.593\n", "Current loss value: 660.558\n", "Current loss value: 685.105\n", "Current loss value: 718.288\n", "Filter 127 processed in 13s\n", "Processing filter 128\n", "Current loss value: 11.8041\n", "Current loss value: 33.4184\n", "Current loss value: 89.3109\n", "Current loss value: 161.365\n", "Current loss value: 239.067\n", "Current loss value: 324.353\n", "Current loss value: 393.965\n", "Current loss value: 471.123\n", "Current loss value: 534.207\n", "Current loss value: 
601.114\n", "Current loss value: 668.335\n", "Current loss value: 731.978\n", "Current loss value: 796.619\n", "Current loss value: 855.815\n", "Current loss value: 917.469\n", "Current loss value: 978.473\n", "Current loss value: 1038.82\n", "Current loss value: 1101.26\n", "Current loss value: 1163.9\n", "Current loss value: 1223.58\n", "Filter 128 processed in 13s\n", "Processing filter 129\n", "Current loss value: 0.0197908\n", "Current loss value: 11.9263\n", "Current loss value: 47.9937\n", "Current loss value: 85.3673\n", "Current loss value: 123.603\n", "Current loss value: 168.454\n", "Current loss value: 220.959\n", "Current loss value: 274.216\n", "Current loss value: 341.133\n", "Current loss value: 397.998\n", "Current loss value: 461.482\n", "Current loss value: 516.766\n", "Current loss value: 573.877\n", "Current loss value: 630.691\n", "Current loss value: 684.526\n", "Current loss value: 747.39\n", "Current loss value: 796.828\n", "Current loss value: 856.227\n", "Current loss value: 909.691\n", "Current loss value: 964.128\n", "Filter 129 processed in 14s\n", "Processing filter 130\n", "Current loss value: 0.0554755\n", "Current loss value: 0.31422\n", "Current loss value: 3.47729\n", "Current loss value: 10.772\n", "Current loss value: 31.5762\n", "Current loss value: 53.9945\n", "Current loss value: 84.6858\n", "Current loss value: 109.495\n", "Current loss value: 136.047\n", "Current loss value: 158.337\n", "Current loss value: 185.799\n", "Current loss value: 212.545\n", "Current loss value: 243.597\n", "Current loss value: 274.513\n", "Current loss value: 304.421\n", "Current loss value: 334.25\n", "Current loss value: 365.592\n", "Current loss value: 396.166\n", "Current loss value: 424.687\n", "Current loss value: 452.832\n", "Filter 130 processed in 14s\n", "Processing filter 131\n", "Current loss value: 5.95226\n", "Current loss value: 22.4655\n", "Current loss value: 62.7959\n", "Current loss value: 153.255\n", "Current loss value: 
222.934\n", "Current loss value: 280.019\n", "Current loss value: 341.76\n", "Current loss value: 400.056\n", "Current loss value: 450.113\n", "Current loss value: 502.025\n", "Current loss value: 554.391\n", "Current loss value: 594.453\n", "Current loss value: 640.489\n", "Current loss value: 690.86\n", "Current loss value: 743.556\n", "Current loss value: 783.734\n", "Current loss value: 832.844\n", "Current loss value: 881.246\n", "Current loss value: 925.319\n", "Current loss value: 970.528\n", "Filter 131 processed in 13s\n", "Processing filter 132\n", "Current loss value: 3.51945\n", "Current loss value: 26.3781\n", "Current loss value: 55.6724\n", "Current loss value: 90.7674\n", "Current loss value: 126.09\n", "Current loss value: 156.644\n", "Current loss value: 191.656\n", "Current loss value: 233.363\n", "Current loss value: 265.085\n", "Current loss value: 300.649\n", "Current loss value: 341.179\n", "Current loss value: 377.275\n", "Current loss value: 411.874\n", "Current loss value: 442.66\n", "Current loss value: 473.002\n", "Current loss value: 504.824\n", "Current loss value: 534.949\n", "Current loss value: 567.11\n", "Current loss value: 593.637\n", "Current loss value: 629.979\n", "Filter 132 processed in 13s\n", "Processing filter 133\n", "Current loss value: 15.1246\n", "Current loss value: 37.7011\n", "Current loss value: 86.3794\n", "Current loss value: 138.893\n", "Current loss value: 186.13\n", "Current loss value: 232.706\n", "Current loss value: 272.242\n", "Current loss value: 319.465\n", "Current loss value: 361.534\n", "Current loss value: 403.144\n", "Current loss value: 430.68\n", "Current loss value: 479.486\n", "Current loss value: 504.934\n", "Current loss value: 552.363\n", "Current loss value: 585.97\n", "Current loss value: 626.175\n", "Current loss value: 664.405\n", "Current loss value: 703.474\n", "Current loss value: 737.639\n", "Current loss value: 779.186\n", "Filter 133 processed in 13s\n", "Processing filter 134\n", 
"Current loss value: 0.0\n", "Filter 134 processed in 1s\n", "Processing filter 135\n", "Current loss value: 0.647408\n", "Current loss value: 8.22956\n", "Current loss value: 35.56\n", "Current loss value: 84.5487\n", "Current loss value: 132.883\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Current loss value: 181.084\n", "Current loss value: 227.495\n", "Current loss value: 280.743\n", "Current loss value: 321.163\n", "Current loss value: 362.984\n", "Current loss value: 396.477\n", "Current loss value: 435.102\n", "Current loss value: 470.707\n", "Current loss value: 509.246\n", "Current loss value: 545.459\n", "Current loss value: 577.666\n", "Current loss value: 610.95\n", "Current loss value: 643.016\n", "Current loss value: 673.296\n", "Current loss value: 709.283\n", "Filter 135 processed in 13s\n", "Processing filter 136\n", "Current loss value: 7.88142\n", "Current loss value: 35.3012\n", "Current loss value: 76.8522\n", "Current loss value: 128.254\n", "Current loss value: 182.636\n", "Current loss value: 235.804\n", "Current loss value: 293.274\n", "Current loss value: 345.261\n", "Current loss value: 396.504\n", "Current loss value: 449.938\n", "Current loss value: 505.6\n", "Current loss value: 562.774\n", "Current loss value: 615.957\n", "Current loss value: 670.463\n", "Current loss value: 729.23\n", "Current loss value: 787.897\n", "Current loss value: 848.149\n", "Current loss value: 911.842\n", "Current loss value: 972.662\n", "Current loss value: 1034.81\n", "Filter 136 processed in 13s\n", "Processing filter 137\n", "Current loss value: 12.9202\n", "Current loss value: 40.4863\n", "Current loss value: 97.4378\n", "Current loss value: 154.109\n", "Current loss value: 201.9\n", "Current loss value: 251.55\n", "Current loss value: 296.472\n", "Current loss value: 336.62\n", "Current loss value: 374.274\n", "Current loss value: 410.494\n", "Current loss value: 445.076\n", "Current loss value: 484.168\n", "Current loss value: 
525.105\n", "Current loss value: 561.789\n", "Current loss value: 605.584\n", "Current loss value: 644.479\n", "Current loss value: 686.225\n", "Current loss value: 727.98\n", "Current loss value: 771.901\n", "Current loss value: 814.804\n", "Filter 137 processed in 13s\n", "Processing filter 138\n", "Current loss value: 38.9311\n", "Current loss value: 211.381\n", "Current loss value: 377.57\n", "Current loss value: 514.186\n", "Current loss value: 630.351\n", "Current loss value: 732.625\n", "Current loss value: 829.568\n", "Current loss value: 919.272\n", "Current loss value: 1019.32\n", "Current loss value: 1110.06\n", "Current loss value: 1198.47\n", "Current loss value: 1284.01\n", "Current loss value: 1369.48\n", "Current loss value: 1453.85\n", "Current loss value: 1542.0\n", "Current loss value: 1626.56\n", "Current loss value: 1712.29\n", "Current loss value: 1795.49\n", "Current loss value: 1881.6\n", "Current loss value: 1971.07\n", "Filter 138 processed in 14s\n", "Processing filter 139\n", "Current loss value: 11.7236\n", "Current loss value: 31.3423\n", "Current loss value: 76.3505\n", "Current loss value: 126.301\n", "Current loss value: 184.258\n", "Current loss value: 235.06\n", "Current loss value: 287.676\n", "Current loss value: 336.991\n", "Current loss value: 387.876\n", "Current loss value: 439.964\n", "Current loss value: 487.257\n", "Current loss value: 529.437\n", "Current loss value: 579.105\n", "Current loss value: 623.904\n", "Current loss value: 668.561\n", "Current loss value: 713.203\n", "Current loss value: 755.109\n", "Current loss value: 798.446\n", "Current loss value: 837.797\n", "Current loss value: 880.498\n", "Filter 139 processed in 13s\n", "Processing filter 140\n", "Current loss value: 11.5859\n", "Current loss value: 41.6294\n", "Current loss value: 69.8171\n", "Current loss value: 105.204\n", "Current loss value: 145.65\n", "Current loss value: 192.574\n", "Current loss value: 237.351\n", "Current loss value: 
268.974\n", "Current loss value: 304.881\n", "Current loss value: 339.914\n", "Current loss value: 378.291\n", "Current loss value: 418.723\n", "Current loss value: 446.401\n", "Current loss value: 489.935\n", "Current loss value: 516.14\n", "Current loss value: 552.201\n", "Current loss value: 586.218\n", "Current loss value: 626.154\n", "Current loss value: 654.009\n", "Current loss value: 692.16\n", "Filter 140 processed in 13s\n", "Processing filter 141\n", "Current loss value: 1.68742\n", "Current loss value: 12.3836\n", "Current loss value: 49.7669\n", "Current loss value: 111.052\n", "Current loss value: 168.438\n", "Current loss value: 224.08\n", "Current loss value: 264.906\n", "Current loss value: 308.461\n", "Current loss value: 344.738\n", "Current loss value: 390.38\n", "Current loss value: 425.968\n", "Current loss value: 469.443\n", "Current loss value: 502.643\n", "Current loss value: 543.946\n", "Current loss value: 582.811\n", "Current loss value: 624.203\n", "Current loss value: 662.032\n", "Current loss value: 694.901\n", "Current loss value: 732.988\n", "Current loss value: 773.865\n", "Filter 141 processed in 14s\n", "Processing filter 142\n", "Current loss value: 0.0\n", "Filter 142 processed in 1s\n", "Processing filter 143\n", "Current loss value: 12.1595\n", "Current loss value: 32.2748\n", "Current loss value: 79.2924\n", "Current loss value: 150.246\n", "Current loss value: 218.104\n", "Current loss value: 287.593\n", "Current loss value: 356.417\n", "Current loss value: 436.654\n", "Current loss value: 511.082\n", "Current loss value: 593.729\n", "Current loss value: 680.667\n", "Current loss value: 749.448\n", "Current loss value: 828.641\n", "Current loss value: 896.443\n", "Current loss value: 968.436\n", "Current loss value: 1037.54\n", "Current loss value: 1107.38\n", "Current loss value: 1170.46\n", "Current loss value: 1235.44\n", "Current loss value: 1303.57\n", "Filter 143 processed in 13s\n", "Processing filter 144\n", 
"Current loss value: 0.0898419\n", "Current loss value: 0.150648\n", "Current loss value: 2.75563\n", "Current loss value: 18.0165\n", "Current loss value: 33.1226\n", "Current loss value: 60.1069\n", "Current loss value: 98.874\n", "Current loss value: 129.747\n", "Current loss value: 152.469\n", "Current loss value: 184.566\n", "Current loss value: 222.714\n", "Current loss value: 263.555\n", "Current loss value: 292.691\n", "Current loss value: 333.333\n", "Current loss value: 362.912\n", "Current loss value: 400.279\n", "Current loss value: 426.343\n", "Current loss value: 465.098\n", "Current loss value: 489.396\n", "Current loss value: 523.903\n", "Filter 144 processed in 13s\n", "Processing filter 145\n", "Current loss value: 13.175\n", "Current loss value: 30.3913\n", "Current loss value: 72.8809\n", "Current loss value: 116.546\n", "Current loss value: 156.849\n", "Current loss value: 191.748\n", "Current loss value: 230.06\n", "Current loss value: 266.13\n", "Current loss value: 304.529\n", "Current loss value: 346.562\n", "Current loss value: 387.388\n", "Current loss value: 430.142\n", "Current loss value: 474.051\n", "Current loss value: 518.006\n", "Current loss value: 560.407\n", "Current loss value: 606.124\n", "Current loss value: 646.831\n", "Current loss value: 686.839\n", "Current loss value: 725.066\n", "Current loss value: 766.601\n", "Filter 145 processed in 14s\n", "Processing filter 146\n", "Current loss value: 12.474\n", "Current loss value: 37.9123\n", "Current loss value: 80.1126\n", "Current loss value: 130.767\n", "Current loss value: 196.943\n", "Current loss value: 245.332\n", "Current loss value: 304.477\n", "Current loss value: 350.778\n", "Current loss value: 395.687\n", "Current loss value: 447.175\n", "Current loss value: 490.49\n", "Current loss value: 536.538\n", "Current loss value: 580.624\n", "Current loss value: 630.69\n", "Current loss value: 682.28\n", "Current loss value: 735.236\n", "Current loss value: 794.704\n", 
"Current loss value: 848.864\n", "Current loss value: 909.203\n", "Current loss value: 960.84\n", "Filter 146 processed in 13s\n", "Processing filter 147\n", "Current loss value: 0.0\n", "Filter 147 processed in 1s\n", "Processing filter 148\n", "Current loss value: 0.105561\n", "Current loss value: 0.687443\n", "Current loss value: 5.80797\n", "Current loss value: 13.3489\n", "Current loss value: 26.3876\n", "Current loss value: 37.8283\n", "Current loss value: 44.5144\n", "Current loss value: 53.701\n", "Current loss value: 76.3701\n", "Current loss value: 91.5203\n", "Current loss value: 113.158\n", "Current loss value: 139.188\n", "Current loss value: 166.052\n", "Current loss value: 189.952\n", "Current loss value: 218.902\n", "Current loss value: 250.241\n", "Current loss value: 280.748\n", "Current loss value: 315.399\n", "Current loss value: 352.1\n", "Current loss value: 387.871\n", "Filter 148 processed in 13s\n", "Processing filter 149\n", "Current loss value: 74.9676\n", "Current loss value: 117.43\n", "Current loss value: 200.55\n", "Current loss value: 274.051\n", "Current loss value: 343.151\n", "Current loss value: 405.213\n", "Current loss value: 465.805\n", "Current loss value: 514.183\n", "Current loss value: 560.85\n", "Current loss value: 607.33\n", "Current loss value: 652.317\n", "Current loss value: 692.677\n", "Current loss value: 743.878\n", "Current loss value: 782.132\n", "Current loss value: 832.347\n", "Current loss value: 856.757\n", "Current loss value: 914.511\n", "Current loss value: 947.934\n", "Current loss value: 984.161\n", "Current loss value: 1026.59\n", "Filter 149 processed in 14s\n", "Processing filter 150\n", "Current loss value: 20.5094\n", "Current loss value: 55.6082\n", "Current loss value: 107.273\n", "Current loss value: 162.508\n", "Current loss value: 198.223\n", "Current loss value: 254.749\n", "Current loss value: 302.909\n", "Current loss value: 344.082\n", "Current loss value: 389.951\n", "Current loss value: 
431.126\n", "Current loss value: 472.181\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Current loss value: 516.773\n", "Current loss value: 559.79\n", "Current loss value: 599.739\n", "Current loss value: 641.155\n", "Current loss value: 681.45\n", "Current loss value: 717.172\n", "Current loss value: 758.94\n", "Current loss value: 797.47\n", "Current loss value: 838.921\n", "Filter 150 processed in 14s\n", "Processing filter 151\n", "Current loss value: 0.213318\n", "Current loss value: 0.0\n", "Filter 151 processed in 2s\n", "Processing filter 152\n", "Current loss value: 5.05307\n", "Current loss value: 28.1593\n", "Current loss value: 33.6059\n", "Current loss value: 61.738\n", "Current loss value: 92.8141\n", "Current loss value: 161.462\n", "Current loss value: 199.169\n", "Current loss value: 219.119\n", "Current loss value: 298.818\n", "Current loss value: 326.52\n", "Current loss value: 359.345\n", "Current loss value: 424.577\n", "Current loss value: 451.079\n", "Current loss value: 487.62\n", "Current loss value: 538.5\n", "Current loss value: 561.479\n", "Current loss value: 599.125\n", "Current loss value: 629.996\n", "Current loss value: 679.065\n", "Current loss value: 706.068\n", "Filter 152 processed in 14s\n", "Processing filter 153\n", "Current loss value: 1.0982\n", "Current loss value: 3.7379\n", "Current loss value: 12.001\n", "Current loss value: 59.6012\n", "Current loss value: 107.487\n", "Current loss value: 148.364\n", "Current loss value: 193.127\n", "Current loss value: 238.108\n", "Current loss value: 282.711\n", "Current loss value: 324.508\n", "Current loss value: 368.213\n", "Current loss value: 408.253\n", "Current loss value: 449.297\n", "Current loss value: 474.156\n", "Current loss value: 530.664\n", "Current loss value: 563.83\n", "Current loss value: 607.869\n", "Current loss value: 647.664\n", "Current loss value: 683.685\n", "Current loss value: 724.543\n", "Filter 153 processed in 13s\n", "Processing 
filter 154\n", "Current loss value: 18.5975\n", "Current loss value: 68.9119\n", "Current loss value: 131.831\n", "Current loss value: 198.807\n", "Current loss value: 264.801\n", "Current loss value: 327.715\n", "Current loss value: 381.751\n", "Current loss value: 428.175\n", "Current loss value: 474.773\n", "Current loss value: 530.529\n", "Current loss value: 571.768\n", "Current loss value: 626.078\n", "Current loss value: 677.22\n", "Current loss value: 725.922\n", "Current loss value: 769.272\n", "Current loss value: 809.197\n", "Current loss value: 856.681\n", "Current loss value: 895.518\n", "Current loss value: 938.823\n", "Current loss value: 975.738\n", "Filter 154 processed in 13s\n", "Processing filter 155\n", "Current loss value: 14.0555\n", "Current loss value: 31.0195\n", "Current loss value: 52.957\n", "Current loss value: 75.5244\n", "Current loss value: 98.0293\n", "Current loss value: 120.28\n", "Current loss value: 170.96\n", "Current loss value: 201.232\n", "Current loss value: 242.47\n", "Current loss value: 275.429\n", "Current loss value: 330.494\n", "Current loss value: 363.016\n", "Current loss value: 425.686\n", "Current loss value: 462.756\n", "Current loss value: 508.462\n", "Current loss value: 547.685\n", "Current loss value: 596.24\n", "Current loss value: 632.23\n", "Current loss value: 682.117\n", "Current loss value: 714.974\n", "Filter 155 processed in 13s\n", "Processing filter 156\n", "Current loss value: 0.0\n", "Filter 156 processed in 1s\n", "Processing filter 157\n", "Current loss value: 9.0774\n", "Current loss value: 50.9322\n", "Current loss value: 129.167\n", "Current loss value: 219.719\n", "Current loss value: 299.439\n", "Current loss value: 379.445\n", "Current loss value: 468.797\n", "Current loss value: 558.183\n", "Current loss value: 637.946\n", "Current loss value: 713.606\n", "Current loss value: 781.682\n", "Current loss value: 852.706\n", "Current loss value: 919.252\n", "Current loss value: 983.115\n", 
"Current loss value: 1053.61\n", "Current loss value: 1120.64\n", "Current loss value: 1182.01\n", "Current loss value: 1249.81\n", "Current loss value: 1313.52\n", "Current loss value: 1380.5\n", "Filter 157 processed in 13s\n", "Processing filter 158\n", "Current loss value: 7.62839\n", "Current loss value: 13.1455\n", "Current loss value: 26.3612\n", "Current loss value: 48.5208\n", "Current loss value: 91.7924\n", "Current loss value: 134.521\n", "Current loss value: 182.831\n", "Current loss value: 233.165\n", "Current loss value: 290.194\n", "Current loss value: 345.37\n", "Current loss value: 404.578\n", "Current loss value: 461.226\n", "Current loss value: 512.083\n", "Current loss value: 565.324\n", "Current loss value: 617.447\n", "Current loss value: 669.004\n", "Current loss value: 720.427\n", "Current loss value: 769.595\n", "Current loss value: 820.348\n", "Current loss value: 870.785\n", "Filter 158 processed in 13s\n", "Processing filter 159\n", "Current loss value: 81.5362\n", "Current loss value: 95.4105\n", "Current loss value: 116.505\n", "Current loss value: 144.808\n", "Current loss value: 172.381\n", "Current loss value: 191.195\n", "Current loss value: 212.734\n", "Current loss value: 236.957\n", "Current loss value: 256.505\n", "Current loss value: 283.213\n", "Current loss value: 302.159\n", "Current loss value: 325.513\n", "Current loss value: 346.822\n", "Current loss value: 373.243\n", "Current loss value: 393.632\n", "Current loss value: 419.296\n", "Current loss value: 437.405\n", "Current loss value: 462.961\n", "Current loss value: 491.109\n", "Current loss value: 503.725\n", "Filter 159 processed in 13s\n", "Processing filter 160\n", "Current loss value: 1.15952\n", "Current loss value: 17.3922\n", "Current loss value: 47.5851\n", "Current loss value: 86.6652\n", "Current loss value: 132.363\n", "Current loss value: 181.256\n", "Current loss value: 226.683\n", "Current loss value: 264.99\n", "Current loss value: 304.571\n", 
"Current loss value: 345.372\n", "Current loss value: 388.902\n", "Current loss value: 431.537\n", "Current loss value: 470.585\n", "Current loss value: 513.636\n", "Current loss value: 551.077\n", "Current loss value: 591.082\n", "Current loss value: 633.518\n", "Current loss value: 672.711\n", "Current loss value: 712.074\n", "Current loss value: 748.427\n", "Filter 160 processed in 13s\n", "Processing filter 161\n", "Current loss value: 5.17091\n", "Current loss value: 34.4184\n", "Current loss value: 84.3915\n", "Current loss value: 139.921\n", "Current loss value: 191.877\n", "Current loss value: 239.58\n", "Current loss value: 281.094\n", "Current loss value: 327.0\n", "Current loss value: 372.588\n", "Current loss value: 413.489\n", "Current loss value: 459.442\n", "Current loss value: 499.112\n", "Current loss value: 546.377\n", "Current loss value: 585.702\n", "Current loss value: 626.261\n", "Current loss value: 665.606\n", "Current loss value: 704.261\n", "Current loss value: 744.91\n", "Current loss value: 781.367\n", "Current loss value: 817.633\n", "Filter 161 processed in 13s\n", "Processing filter 162\n", "Current loss value: 0.113867\n", "Current loss value: 1.76728\n", "Current loss value: 10.0015\n", "Current loss value: 21.7895\n", "Current loss value: 35.3095\n", "Current loss value: 48.829\n", "Current loss value: 61.9834\n", "Current loss value: 78.1602\n", "Current loss value: 94.1211\n", "Current loss value: 112.06\n", "Current loss value: 131.478\n", "Current loss value: 148.206\n", "Current loss value: 167.793\n", "Current loss value: 187.201\n", "Current loss value: 204.627\n", "Current loss value: 231.415\n", "Current loss value: 254.66\n", "Current loss value: 278.502\n", "Current loss value: 304.385\n", "Current loss value: 328.419\n", "Filter 162 processed in 14s\n", "Processing filter 163\n", "Current loss value: 0.0\n", "Filter 163 processed in 1s\n", "Processing filter 164\n", "Current loss value: 0.0\n", "Filter 164 processed in 
1s\n", "Processing filter 165\n", "Current loss value: 3.68111\n", "Current loss value: 17.9334\n", "Current loss value: 47.9843\n", "Current loss value: 85.294\n", "Current loss value: 125.737\n", "Current loss value: 160.926\n", "Current loss value: 213.58\n", "Current loss value: 251.561\n", "Current loss value: 291.31\n", "Current loss value: 325.946\n", "Current loss value: 364.271\n", "Current loss value: 405.69\n", "Current loss value: 442.073\n", "Current loss value: 479.254\n", "Current loss value: 515.757\n", "Current loss value: 554.988\n", "Current loss value: 592.602\n", "Current loss value: 630.048\n", "Current loss value: 675.336\n", "Current loss value: 716.121\n", "Filter 165 processed in 13s\n", "Processing filter 166\n", "Current loss value: 0.0827764\n", "Current loss value: 6.74735\n", "Current loss value: 20.8071\n", "Current loss value: 34.1314\n", "Current loss value: 64.4756\n", "Current loss value: 92.496\n", "Current loss value: 124.719\n", "Current loss value: 160.169\n", "Current loss value: 196.008\n", "Current loss value: 248.691\n", "Current loss value: 279.718\n", "Current loss value: 333.986\n", "Current loss value: 379.278\n", "Current loss value: 428.177\n", "Current loss value: 460.248\n", "Current loss value: 506.249\n", "Current loss value: 551.039\n", "Current loss value: 605.721\n", "Current loss value: 655.093\n", "Current loss value: 705.61\n", "Filter 166 processed in 13s\n", "Processing filter 167\n", "Current loss value: 32.0123\n", "Current loss value: 58.3991\n", "Current loss value: 101.27\n", "Current loss value: 142.235\n", "Current loss value: 174.153\n", "Current loss value: 211.2\n", "Current loss value: 242.071\n", "Current loss value: 276.892\n", "Current loss value: 306.029\n", "Current loss value: 326.592\n", "Current loss value: 362.713\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Current loss value: 381.773\n", "Current loss value: 413.888\n", "Current loss value: 440.044\n", "Current 
loss value: 471.658\n", "Current loss value: 497.951\n", "Current loss value: 533.059\n", "Current loss value: 560.631\n", "Current loss value: 594.689\n", "Current loss value: 624.129\n", "Filter 167 processed in 13s\n", "Processing filter 168\n", "Current loss value: 3.45529\n", "Current loss value: 17.9597\n", "Current loss value: 59.8018\n", "Current loss value: 106.852\n", "Current loss value: 162.199\n", "Current loss value: 210.219\n", "Current loss value: 262.751\n", "Current loss value: 311.918\n", "Current loss value: 361.851\n", "Current loss value: 413.56\n", "Current loss value: 453.604\n", "Current loss value: 497.516\n", "Current loss value: 538.957\n", "Current loss value: 579.673\n", "Current loss value: 619.737\n", "Current loss value: 661.35\n", "Current loss value: 702.872\n", "Current loss value: 748.879\n", "Current loss value: 788.822\n", "Current loss value: 832.784\n", "Filter 168 processed in 13s\n", "Processing filter 169\n", "Current loss value: 0.0\n", "Filter 169 processed in 1s\n", "Processing filter 170\n", "Current loss value: 17.4247\n", "Current loss value: 25.1649\n", "Current loss value: 68.6012\n", "Current loss value: 112.486\n", "Current loss value: 158.347\n", "Current loss value: 204.477\n", "Current loss value: 257.918\n", "Current loss value: 310.166\n", "Current loss value: 355.255\n", "Current loss value: 409.035\n", "Current loss value: 451.417\n", "Current loss value: 494.885\n", "Current loss value: 548.859\n", "Current loss value: 590.231\n", "Current loss value: 642.256\n", "Current loss value: 689.996\n", "Current loss value: 733.371\n", "Current loss value: 778.632\n", "Current loss value: 829.705\n", "Current loss value: 871.902\n", "Filter 170 processed in 13s\n", "Processing filter 171\n", "Current loss value: 50.724\n", "Current loss value: 117.67\n", "Current loss value: 196.31\n", "Current loss value: 285.857\n", "Current loss value: 357.296\n", "Current loss value: 423.328\n", "Current loss value: 
467.027\n", "Current loss value: 527.064\n", "Current loss value: 563.522\n", "Current loss value: 623.537\n", "Current loss value: 679.472\n", "Current loss value: 720.054\n", "Current loss value: 775.486\n", "Current loss value: 816.226\n", "Current loss value: 872.037\n", "Current loss value: 919.051\n", "Current loss value: 966.807\n", "Current loss value: 1013.99\n", "Current loss value: 1064.29\n", "Current loss value: 1116.15\n", "Filter 171 processed in 13s\n", "Processing filter 172\n", "Current loss value: 0.0\n", "Filter 172 processed in 1s\n", "Processing filter 173\n", "Current loss value: 1.41919\n", "Current loss value: 29.7714\n", "Current loss value: 98.8805\n", "Current loss value: 174.763\n", "Current loss value: 262.967\n", "Current loss value: 358.726\n", "Current loss value: 460.228\n", "Current loss value: 551.423\n", "Current loss value: 639.872\n", "Current loss value: 717.61\n", "Current loss value: 794.437\n", "Current loss value: 863.734\n", "Current loss value: 937.328\n", "Current loss value: 1006.11\n", "Current loss value: 1079.78\n", "Current loss value: 1152.32\n", "Current loss value: 1221.82\n", "Current loss value: 1290.88\n", "Current loss value: 1364.67\n", "Current loss value: 1427.57\n", "Filter 173 processed in 13s\n", "Processing filter 174\n", "Current loss value: 2.97676\n", "Current loss value: 9.18804\n", "Current loss value: 27.3694\n", "Current loss value: 59.5394\n", "Current loss value: 110.111\n", "Current loss value: 165.312\n", "Current loss value: 219.812\n", "Current loss value: 289.442\n", "Current loss value: 346.422\n", "Current loss value: 404.528\n", "Current loss value: 470.92\n", "Current loss value: 524.428\n", "Current loss value: 577.847\n", "Current loss value: 633.703\n", "Current loss value: 678.643\n", "Current loss value: 726.693\n", "Current loss value: 781.151\n", "Current loss value: 827.458\n", "Current loss value: 872.114\n", "Current loss value: 916.268\n", "Filter 174 processed in 14s\n", 
"Processing filter 175\n", "Current loss value: 2.76869\n", "Current loss value: 33.3506\n", "Current loss value: 89.7546\n", "Current loss value: 147.03\n", "Current loss value: 208.264\n", "Current loss value: 263.22\n", "Current loss value: 319.398\n", "Current loss value: 371.025\n", "Current loss value: 423.54\n", "Current loss value: 481.586\n", "Current loss value: 536.198\n", "Current loss value: 589.226\n", "Current loss value: 642.548\n", "Current loss value: 689.68\n", "Current loss value: 735.948\n", "Current loss value: 789.485\n", "Current loss value: 835.268\n", "Current loss value: 887.573\n", "Current loss value: 936.906\n", "Current loss value: 985.386\n", "Filter 175 processed in 13s\n", "Processing filter 176\n", "Current loss value: 0.154817\n", "Current loss value: 16.0359\n", "Current loss value: 63.4644\n", "Current loss value: 119.482\n", "Current loss value: 167.465\n", "Current loss value: 218.239\n", "Current loss value: 258.87\n", "Current loss value: 304.739\n", "Current loss value: 349.967\n", "Current loss value: 395.75\n", "Current loss value: 437.192\n", "Current loss value: 480.481\n", "Current loss value: 519.922\n", "Current loss value: 560.172\n", "Current loss value: 601.405\n", "Current loss value: 643.95\n", "Current loss value: 684.896\n", "Current loss value: 727.147\n", "Current loss value: 776.65\n", "Current loss value: 855.977\n", "Filter 176 processed in 13s\n", "Processing filter 177\n", "Current loss value: 8.10026\n", "Current loss value: 20.3723\n", "Current loss value: 33.3446\n", "Current loss value: 51.3838\n", "Current loss value: 73.9811\n", "Current loss value: 113.298\n", "Current loss value: 159.172\n", "Current loss value: 219.28\n", "Current loss value: 275.075\n", "Current loss value: 326.917\n", "Current loss value: 395.021\n", "Current loss value: 444.958\n", "Current loss value: 510.353\n", "Current loss value: 565.586\n", "Current loss value: 620.138\n", "Current loss value: 674.435\n", "Current 
loss value: 723.164\n", "Current loss value: 777.304\n", "Current loss value: 829.803\n", "Current loss value: 882.499\n", "Filter 177 processed in 13s\n", "Processing filter 178\n", "Current loss value: 0.0\n", "Filter 178 processed in 1s\n", "Processing filter 179\n", "Current loss value: 0.0\n", "Filter 179 processed in 1s\n", "Processing filter 180\n", "Current loss value: 1.43743\n", "Current loss value: 21.0355\n", "Current loss value: 48.3323\n", "Current loss value: 75.3126\n", "Current loss value: 102.388\n", "Current loss value: 131.789\n", "Current loss value: 160.356\n", "Current loss value: 183.684\n", "Current loss value: 212.235\n", "Current loss value: 241.415\n", "Current loss value: 271.675\n", "Current loss value: 297.087\n", "Current loss value: 331.113\n", "Current loss value: 356.698\n", "Current loss value: 386.544\n", "Current loss value: 420.592\n", "Current loss value: 449.39\n", "Current loss value: 478.125\n", "Current loss value: 506.305\n", "Current loss value: 536.524\n", "Filter 180 processed in 14s\n", "Processing filter 181\n", "Current loss value: 11.4828\n", "Current loss value: 47.3974\n", "Current loss value: 105.928\n", "Current loss value: 156.671\n", "Current loss value: 200.05\n", "Current loss value: 245.781\n", "Current loss value: 278.973\n", "Current loss value: 319.908\n", "Current loss value: 352.107\n", "Current loss value: 390.481\n", "Current loss value: 425.138\n", "Current loss value: 454.508\n", "Current loss value: 490.884\n", "Current loss value: 520.906\n", "Current loss value: 560.854\n", "Current loss value: 590.612\n", "Current loss value: 630.896\n", "Current loss value: 666.412\n", "Current loss value: 705.941\n", "Current loss value: 739.936\n", "Filter 181 processed in 14s\n", "Processing filter 182\n", "Current loss value: 24.609\n", "Current loss value: 59.2556\n", "Current loss value: 177.322\n", "Current loss value: 308.108\n", "Current loss value: 429.604\n", "Current loss value: 534.172\n", 
"Current loss value: 629.824\n", "Current loss value: 730.719\n", "Current loss value: 828.627\n", "Current loss value: 930.078\n", "Current loss value: 1033.69\n", "Current loss value: 1135.79\n", "Current loss value: 1230.05\n", "Current loss value: 1320.94\n", "Current loss value: 1407.27\n", "Current loss value: 1494.2\n", "Current loss value: 1573.95\n", "Current loss value: 1659.27\n", "Current loss value: 1738.38\n", "Current loss value: 1822.76\n", "Filter 182 processed in 13s\n", "Processing filter 183\n", "Current loss value: 51.1109\n", "Current loss value: 100.535\n", "Current loss value: 116.871\n", "Current loss value: 136.013\n", "Current loss value: 174.593\n", "Current loss value: 211.697\n", "Current loss value: 241.951\n", "Current loss value: 275.505\n", "Current loss value: 304.9\n", "Current loss value: 342.57\n", "Current loss value: 375.295\n", "Current loss value: 411.228\n", "Current loss value: 439.392\n", "Current loss value: 474.973\n", "Current loss value: 505.649\n", "Current loss value: 552.678\n", "Current loss value: 581.069\n", "Current loss value: 616.796\n", "Current loss value: 668.015\n", "Current loss value: 691.115\n", "Filter 183 processed in 13s\n", "Processing filter 184\n", "Current loss value: 0.0\n", "Filter 184 processed in 1s\n", "Processing filter 185\n", "Current loss value: 2.80128\n", "Current loss value: 30.8752\n", "Current loss value: 75.5298\n", "Current loss value: 114.954\n", "Current loss value: 154.15\n", "Current loss value: 189.119\n", "Current loss value: 221.943\n", "Current loss value: 256.306\n", "Current loss value: 288.701\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Current loss value: 321.697\n", "Current loss value: 342.951\n", "Current loss value: 374.284\n", "Current loss value: 401.703\n", "Current loss value: 435.132\n", "Current loss value: 463.967\n", "Current loss value: 496.585\n", "Current loss value: 525.707\n", "Current loss value: 559.193\n", "Current loss value: 
582.66\n", "Current loss value: 621.753\n", "Filter 185 processed in 14s\n", "Processing filter 186\n", "Current loss value: 1.75939\n", "Current loss value: 24.2455\n", "Current loss value: 74.225\n", "Current loss value: 125.663\n", "Current loss value: 169.741\n", "Current loss value: 220.574\n", "Current loss value: 259.427\n", "Current loss value: 307.558\n", "Current loss value: 353.976\n", "Current loss value: 396.187\n", "Current loss value: 434.653\n", "Current loss value: 473.838\n", "Current loss value: 506.599\n", "Current loss value: 547.797\n", "Current loss value: 580.162\n", "Current loss value: 616.903\n", "Current loss value: 653.013\n", "Current loss value: 689.849\n", "Current loss value: 726.351\n", "Current loss value: 768.149\n", "Filter 186 processed in 14s\n", "Processing filter 187\n", "Current loss value: 12.0646\n", "Current loss value: 76.6508\n", "Current loss value: 196.816\n", "Current loss value: 320.471\n", "Current loss value: 418.306\n", "Current loss value: 510.554\n", "Current loss value: 580.979\n", "Current loss value: 665.414\n", "Current loss value: 742.188\n", "Current loss value: 822.12\n", "Current loss value: 907.82\n", "Current loss value: 988.511\n", "Current loss value: 1057.46\n", "Current loss value: 1122.99\n", "Current loss value: 1191.23\n", "Current loss value: 1261.25\n", "Current loss value: 1332.66\n", "Current loss value: 1401.79\n", "Current loss value: 1473.75\n", "Current loss value: 1540.01\n", "Filter 187 processed in 16s\n", "Processing filter 188\n", "Current loss value: 3.87623\n", "Current loss value: 60.0137\n", "Current loss value: 127.853\n", "Current loss value: 194.89\n", "Current loss value: 250.809\n", "Current loss value: 317.642\n", "Current loss value: 374.487\n", "Current loss value: 428.768\n", "Current loss value: 475.083\n", "Current loss value: 533.316\n", "Current loss value: 576.684\n", "Current loss value: 621.37\n", "Current loss value: 663.839\n", "Current loss value: 
711.441\n", "Current loss value: 757.011\n", "Current loss value: 799.49\n", "Current loss value: 844.527\n", "Current loss value: 886.612\n", "Current loss value: 924.807\n", "Current loss value: 963.494\n", "Filter 188 processed in 14s\n", "Processing filter 189\n", "Current loss value: 0.30009\n", "Current loss value: 1.56656\n", "Current loss value: 5.45774\n", "Current loss value: 19.9152\n", "Current loss value: 53.5901\n", "Current loss value: 104.224\n", "Current loss value: 149.154\n", "Current loss value: 191.653\n", "Current loss value: 231.599\n", "Current loss value: 286.798\n", "Current loss value: 329.835\n", "Current loss value: 365.02\n", "Current loss value: 403.127\n", "Current loss value: 438.815\n", "Current loss value: 472.309\n", "Current loss value: 509.146\n", "Current loss value: 537.102\n", "Current loss value: 572.399\n", "Current loss value: 599.199\n", "Current loss value: 631.927\n", "Filter 189 processed in 13s\n", "Processing filter 190\n", "Current loss value: 2.15169\n", "Current loss value: 17.2239\n", "Current loss value: 65.745\n", "Current loss value: 123.516\n", "Current loss value: 167.882\n", "Current loss value: 215.172\n", "Current loss value: 256.366\n", "Current loss value: 294.013\n", "Current loss value: 334.682\n", "Current loss value: 381.378\n", "Current loss value: 424.382\n", "Current loss value: 464.403\n", "Current loss value: 504.854\n", "Current loss value: 546.284\n", "Current loss value: 585.603\n", "Current loss value: 629.44\n", "Current loss value: 663.156\n", "Current loss value: 702.347\n", "Current loss value: 736.835\n", "Current loss value: 775.84\n", "Filter 190 processed in 13s\n", "Processing filter 191\n", "Current loss value: 0.108688\n", "Current loss value: 3.98611\n", "Current loss value: 14.0796\n", "Current loss value: 36.9284\n", "Current loss value: 69.0179\n", "Current loss value: 100.23\n", "Current loss value: 144.816\n", "Current loss value: 188.145\n", "Current loss value: 
230.261\n", "Current loss value: 281.478\n", "Current loss value: 318.272\n", "Current loss value: 355.612\n", "Current loss value: 392.875\n", "Current loss value: 435.614\n", "Current loss value: 470.521\n", "Current loss value: 512.235\n", "Current loss value: 549.998\n", "Current loss value: 592.271\n", "Current loss value: 630.731\n", "Current loss value: 669.764\n", "Filter 191 processed in 14s\n", "Processing filter 192\n", "Current loss value: 7.30023\n", "Current loss value: 16.7512\n", "Current loss value: 38.9092\n", "Current loss value: 75.6695\n", "Current loss value: 111.596\n", "Current loss value: 155.479\n", "Current loss value: 191.515\n", "Current loss value: 229.797\n", "Current loss value: 269.255\n", "Current loss value: 301.512\n", "Current loss value: 341.537\n", "Current loss value: 379.735\n", "Current loss value: 415.016\n", "Current loss value: 447.389\n", "Current loss value: 479.157\n", "Current loss value: 516.385\n", "Current loss value: 552.58\n", "Current loss value: 586.694\n", "Current loss value: 623.472\n", "Current loss value: 659.867\n", "Filter 192 processed in 14s\n", "Processing filter 193\n", "Current loss value: 0.0\n", "Filter 193 processed in 1s\n", "Processing filter 194\n", "Current loss value: 0.0\n", "Filter 194 processed in 1s\n", "Processing filter 195\n", "Current loss value: 0.0161795\n", "Current loss value: 1.39424\n", "Current loss value: 12.9991\n", "Current loss value: 32.4331\n", "Current loss value: 52.519\n", "Current loss value: 70.9567\n", "Current loss value: 91.285\n", "Current loss value: 126.722\n", "Current loss value: 165.217\n", "Current loss value: 203.065\n", "Current loss value: 241.671\n", "Current loss value: 280.632\n", "Current loss value: 323.779\n", "Current loss value: 364.596\n", "Current loss value: 399.711\n", "Current loss value: 437.905\n", "Current loss value: 474.906\n", "Current loss value: 516.102\n", "Current loss value: 549.081\n", "Current loss value: 584.211\n", "Filter 
195 processed in 14s\n", "Processing filter 196\n", "Current loss value: 0.0\n", "Filter 196 processed in 1s\n", "Processing filter 197\n", "Current loss value: 0.0\n", "Filter 197 processed in 1s\n", "Processing filter 198\n", "Current loss value: 0.0\n", "Filter 198 processed in 1s\n", "Processing filter 199\n", "Current loss value: 0.0\n", "Filter 199 processed in 1s\n" ] } ], "source": [ "'''Visualization of the filters of VGG16, via gradient ascent in input space.\n", "This script can run on CPU in a few minutes (with the TensorFlow backend).\n", "Results example: http://i.imgur.com/4nj4KjN.jpg\n", "'''\n", "from __future__ import print_function\n", "\n", "from scipy.misc import imsave\n", "import numpy as np\n", "import time\n", "from keras.applications import vgg16\n", "from keras import backend as K\n", "\n", "# dimensions of the generated pictures for each filter.\n", "img_width = 128\n", "img_height = 128\n", "\n", "# the name of the layer we want to visualize\n", "# (see model definition at keras/applications/vgg16.py)\n", "layer_name = 'block5_conv1'\n", "\n", "# util function to convert a tensor into a valid image\n", "\n", "\n", "def deprocess_image(x):\n", " # normalize tensor: center on 0., ensure std is 0.1\n", " x -= x.mean()\n", " x /= (x.std() + 1e-5)\n", " x *= 0.1\n", "\n", " # clip to [0, 1]\n", " x += 0.5\n", " x = np.clip(x, 0, 1)\n", "\n", " # convert to RGB array\n", " x *= 255\n", " if K.image_data_format() == 'channels_first':\n", " x = x.transpose((1, 2, 0))\n", " x = np.clip(x, 0, 255).astype('uint8')\n", " return x\n", "\n", "# defining the VGG16 model in Keras\n", "# load a set of weights pre-trained on ImageNet dataset\n", "# only go up to the last convolutional layer for arbitrary input sizes\n", "model = vgg16.VGG16(weights='imagenet', include_top=False)\n", "print('Model loaded.')\n", "\n", "model.summary()\n", "\n", "# this is the placeholder for the input images\n", "input_img = model.input\n", "\n", "# get the symbolic 
outputs of each \"key\" layer (we gave them unique names).\n", "layer_dict = dict([(layer.name, layer) for layer in model.layers[1:]])\n", "\n", "\n", "def normalize(x):\n", " # utility function to normalize a tensor by its L2 norm\n", " return x / (K.sqrt(K.mean(K.square(x))) + 1e-5)\n", "\n", "\n", "kept_filters = []\n", "for filter_index in range(0, 200):\n", " # we only scan through the first 200 filters,\n", " # but there are actually 512 of them\n", " print('Processing filter %d' % filter_index)\n", " start_time = time.time()\n", "\n", " # we build a loss function that maximizes the activation of\n", " # the nth filter (filter_index) of the layer considered (layer_name)\n", " layer_output = layer_dict[layer_name].output\n", " if K.image_data_format() == 'channels_first':\n", " loss = K.mean(layer_output[:, filter_index, :, :])\n", " else:\n", " loss = K.mean(layer_output[:, :, :, filter_index])\n", "\n", " # we compute the gradient of the input picture wrt this loss\n", " grads = K.gradients(loss, input_img)[0]\n", "\n", " # normalization trick: we normalize the gradient\n", " grads = normalize(grads)\n", "\n", " # this function returns the loss and grads given the input picture\n", " iterate = K.function([input_img], [loss, grads])\n", "\n", " # step size for gradient ascent\n", " step = 1.\n", "\n", " # we start from a gray image with some random noise\n", " if K.image_data_format() == 'channels_first':\n", " input_img_data = np.random.random((1, 3, img_width, img_height))\n", " else:\n", " input_img_data = np.random.random((1, img_width, img_height, 3))\n", " input_img_data = (input_img_data - 0.5) * 20 + 128\n", "\n", " # we run gradient ascent for 20 steps\n", " for i in range(20):\n", " loss_value, grads_value = iterate([input_img_data])\n", " input_img_data += grads_value * step\n", "\n", " print('Current loss value:', loss_value)\n", " if loss_value <= 0.:\n", " # some filters get stuck to 0, we can skip them\n", " break\n", "\n", " # decode the 
resulting input image\n", "    if loss_value > 0:\n", "        img = deprocess_image(input_img_data[0])\n", "        kept_filters.append((img, loss_value))\n", "    end_time = time.time()\n", "    print('Filter %d processed in %ds' % (filter_index, end_time - start_time))\n", "\n", "# we will stitch the best 64 filters on an 8 x 8 grid.\n", "n = 8\n", "\n", "# the filters that have the highest loss are assumed to be better-looking.\n", "# we will only keep the top 64 filters.\n", "kept_filters.sort(key=lambda x: x[1], reverse=True)\n", "kept_filters = kept_filters[:n * n]\n", "\n", "# build a black picture with enough space for\n", "# our 8 x 8 filters of size 128 x 128, with a 5px margin in between\n", "margin = 5\n", "width = n * img_width + (n - 1) * margin\n", "height = n * img_height + (n - 1) * margin\n", "stitched_filters = np.zeros((width, height, 3))\n", "\n", "# fill the picture with our saved filters\n", "for i in range(n):\n", "    for j in range(n):\n", "        img, loss = kept_filters[i * n + j]\n", "        stitched_filters[(img_width + margin) * i: (img_width + margin) * i + img_width,\n", "                         (img_height + margin) * j: (img_height + margin) * j + img_height, :] = img\n", "\n", "# save the result to disk\n", "imsave('stitched_filters_%dx%d.png' % (n, n), stitched_filters)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Examples of different filters" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "A remarkable observation: a lot of these filters are identical, but rotated by some non-random factor (typically 90 degrees). This means that we could potentially compress the number of filters used in a convnet by a large factor by finding a way to make the convolution filters rotation-invariant. 
I can see a few ways this could be achieved - it's an interesting research direction.\n", "\n", "Shockingly, the rotation observation holds true even for relatively high-level filters, such as those in conv4_1.\n", "\n", "In the highest layers we start to recognize textures similar to those found in the objects the network was trained to classify, such as feathers, eyes, etc." ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "### Deep Dreaming" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Another fun thing to do is to apply these filters to photos (rather than to noisy all-gray inputs). This is the principle of Deep Dreams, popularized by Google last year. By picking specific combinations of filters rather than single filters, you can achieve quite pretty results. If you are interested in this, you could also check out the Deep Dream example in Keras, and the Google blog post that introduced the technique." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Artificial Neural Networks are very good at image classification, but we actually understand very little about why certain models work and others don’t.\n", "One of the challenges is understanding what exactly goes on at each layer. We know that after training, each layer progressively extracts higher and higher-level features of the image, until the final layer essentially makes a decision on what the image shows. For example, the first layer may look for edges or corners. Intermediate layers interpret the basic features to look for overall shapes or components, like a door or a leaf. The final few layers assemble those into complete interpretations - these neurons activate in response to very complex things such as entire buildings or trees. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We feed the network an arbitrary image or photo and let the network analyze the picture. 
We then pick a layer and ask the network to enhance whatever it detected.
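Concretely, enhancing a layer means gradient ascent on the input image: take the layer's activation as a score, compute the gradient of that score with respect to the pixels, and repeatedly add a small step in the gradient direction. The toy NumPy sketch below illustrates just the ascent loop; the pattern w, the activation function and the step size are made up for illustration and are not part of the notebook's model.

```python
import numpy as np

# Toy stand-in for a filter's activation: it is maximal when the
# input x matches a fixed preferred pattern w (a made-up example).
rng = np.random.default_rng(0)
w = rng.normal(size=16)           # the pattern the 'filter' prefers

def activation(x):
    return w @ x - 0.5 * (x @ x)  # concave score, maximized at x == w

def grad(x):
    return w - x                  # gradient of the score w.r.t. the input

# Gradient ascent: start from noise and repeatedly step uphill.
x = rng.normal(size=16)
step = 0.1
for _ in range(200):
    x = x + step * grad(x)

print(np.allclose(x, w, atol=1e-3))  # True: the input converged to the pattern
```

The code cells in this notebook do the same thing, only the score is the mean activation of a chosen layer or filter and the gradient is obtained symbolically with K.gradients.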

\n", "Each layer of the network deals with features at a different level of abstraction, so the complexity of features we generate depends on which layer we choose to enhance. For example, lower layers tend to produce strokes or simple ornament-like patterns, because those layers are sensitive to basic features such as edges and their orientations.

\n", "If we choose higher-level layers, which identify more sophisticated features in images, complex features or even whole objects tend to emerge. We ask the network: \"Whatever you see there, I want more of it!\" So we tell the network to maximize the activation of that layer." ] }, { "cell_type": "code", "execution_count": 5, "metadata": { "scrolled": true }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Model loaded.\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "/home/anne/anaconda3/envs/tensorflow/lib/python3.6/site-packages/scipy/ndimage/interpolation.py:600: UserWarning: From scipy 0.13.0, the output shape of zoom() is calculated with round() instead of int() - for these inputs the size of the returned array has changed.\n", "  \"the returned array has changed.\", UserWarning)\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Processing image shape (224, 336)\n", "..Loss value at 0 : 1.75582\n", "..Loss value at 1 : 2.32746\n", "..Loss value at 2 : 3.05959\n", "..Loss value at 3 : 3.78578\n", "..Loss value at 4 : 4.6106\n", "Processing image shape (314, 471)\n", "..Loss value at 0 : 2.75408\n", "..Loss value at 1 : 3.78937\n", "..Loss value at 2 : 4.82047\n", "Processing image shape (440, 660)\n", "..Loss value at 0 : 2.98697\n", "..Loss value at 1 : 3.94282\n", "..Loss value at 2 : 4.96125\n", "img/clouds.jpg\n", "Done.\n" ] } ], "source": [ "from __future__ import print_function\n", "\n", "from keras.preprocessing.image import load_img, img_to_array\n", "import numpy as np\n", "import scipy\n", "import argparse\n", "\n", "from keras.applications import inception_v3\n", "from keras import backend as K\n", "\n", "base_image_path = \"img/clouds.jpg\"\n", "result_prefix = \"results/clouds\"\n", "\n", "# These are the names of the layers\n", "# for which we try to maximize activation,\n", "# as well as their weight in the final loss\n", "# we try to maximize.\n", "# You can tweak these settings to obtain new 
visual effects.\n", "settings = {\n", " 'features': {\n", " 'mixed2': 0.2,\n", " 'mixed3': 0.5,\n", " 'mixed4': 2.,\n", " 'mixed5': 1.5,\n", " },\n", "}\n", "\n", "\n", "def preprocess_image(image_path):\n", " # Util function to open, resize and format pictures\n", " # into appropriate tensors.\n", " img = load_img(image_path)\n", " img = img_to_array(img)\n", " img = np.expand_dims(img, axis=0)\n", " img = inception_v3.preprocess_input(img)\n", " return img\n", "\n", "\n", "def deprocess_image(x):\n", " # Util function to convert a tensor into a valid image.\n", " if K.image_data_format() == 'channels_first':\n", " x = x.reshape((3, x.shape[2], x.shape[3]))\n", " x = x.transpose((1, 2, 0))\n", " else:\n", " x = x.reshape((x.shape[1], x.shape[2], 3))\n", " x /= 2.\n", " x += 0.5\n", " x *= 255.\n", " x = np.clip(x, 0, 255).astype('uint8')\n", " return x\n", "\n", "K.set_learning_phase(0)\n", "\n", "# Build the InceptionV3 network with our placeholder.\n", "# The model will be loaded with pre-trained ImageNet weights.\n", "model = inception_v3.InceptionV3(weights='imagenet',\n", " include_top=False)\n", "dream = model.input\n", "print('Model loaded.')\n", "\n", "# Get the symbolic outputs of each \"key\" layer (we gave them unique names).\n", "layer_dict = dict([(layer.name, layer) for layer in model.layers])\n", "\n", "# Define the loss.\n", "loss = K.variable(0.)\n", "for layer_name in settings['features']:\n", " # Add the L2 norm of the features of a layer to the loss.\n", " assert layer_name in layer_dict.keys(), 'Layer ' + layer_name + ' not found in model.'\n", " coeff = settings['features'][layer_name]\n", " x = layer_dict[layer_name].output\n", " # We avoid border artifacts by only involving non-border pixels in the loss.\n", " scaling = K.prod(K.cast(K.shape(x), 'float32'))\n", " if K.image_data_format() == 'channels_first':\n", " loss += coeff * K.sum(K.square(x[:, :, 2: -2, 2: -2])) / scaling\n", " else:\n", " loss += coeff * K.sum(K.square(x[:, 2: -2, 2: 
-2, :])) / scaling\n", "\n", "# Compute the gradients of the loss wrt the dream (the input image).\n", "grads = K.gradients(loss, dream)[0]\n", "# Normalize gradients.\n", "grads /= K.maximum(K.mean(K.abs(grads)), 1e-7)\n", "\n", "# Set up function to retrieve the value\n", "# of the loss and gradients given an input image.\n", "outputs = [loss, grads]\n", "fetch_loss_and_grads = K.function([dream], outputs)\n", "\n", "\n", "def eval_loss_and_grads(x):\n", " outs = fetch_loss_and_grads([x])\n", " loss_value = outs[0]\n", " grad_values = outs[1]\n", " return loss_value, grad_values\n", "\n", "\n", "def resize_img(img, size):\n", " img = np.copy(img)\n", " if K.image_data_format() == 'channels_first':\n", " factors = (1, 1,\n", " float(size[0]) / img.shape[2],\n", " float(size[1]) / img.shape[3])\n", " else:\n", " factors = (1,\n", " float(size[0]) / img.shape[1],\n", " float(size[1]) / img.shape[2],\n", " 1)\n", " return scipy.ndimage.zoom(img, factors, order=1)\n", "\n", "\n", "def gradient_ascent(x, iterations, step, max_loss=None):\n", " for i in range(iterations):\n", " loss_value, grad_values = eval_loss_and_grads(x)\n", " if max_loss is not None and loss_value > max_loss:\n", " break\n", " print('..Loss value at', i, ':', loss_value)\n", " x += step * grad_values\n", " return x\n", "\n", "\n", "def save_img(img, fname):\n", " pil_img = deprocess_image(np.copy(img))\n", " scipy.misc.imsave(fname, pil_img)\n", "\n", "\n", "\"\"\"Process:\n", "- Load the original image.\n", "- Define a number of processing scales (i.e. image shapes),\n", " from smallest to largest.\n", "- Resize the original image to the smallest scale.\n", "- For every scale, starting with the smallest (i.e. 
current one):\n", " - Run gradient ascent\n", " - Upscale image to the next scale\n", " - Reinject the detail that was lost at upscaling time\n", "- Stop when we are back to the original size.\n", "To obtain the detail lost during upscaling, we simply\n", "take the original image, shrink it down, upscale it,\n", "and compare the result to the (resized) original image.\n", "\"\"\"\n", "\n", "\n", "# Playing with these hyperparameters will also allow you to achieve new effects\n", "step = 0.01 # Gradient ascent step size\n", "num_octave = 3 # Number of scales at which to run gradient ascent\n", "octave_scale = 1.4 # Size ratio between scales\n", "iterations = 20 # Number of ascent steps per scale\n", "max_loss = 5.\n", "\n", "img = preprocess_image(base_image_path)\n", "if K.image_data_format() == 'channels_first':\n", " original_shape = img.shape[2:]\n", "else:\n", " original_shape = img.shape[1:3]\n", "successive_shapes = [original_shape]\n", "for i in range(1, num_octave):\n", " shape = tuple([int(dim / (octave_scale ** i)) for dim in original_shape])\n", " successive_shapes.append(shape)\n", "successive_shapes = successive_shapes[::-1]\n", "original_img = np.copy(img)\n", "shrunk_original_img = resize_img(img, successive_shapes[0])\n", "\n", "for shape in successive_shapes:\n", " print('Processing image shape', shape)\n", " img = resize_img(img, shape)\n", " img = gradient_ascent(img,\n", " iterations=iterations,\n", " step=step,\n", " max_loss=max_loss)\n", " upscaled_shrunk_original_img = resize_img(shrunk_original_img, shape)\n", " same_size_original = resize_img(original_img, shape)\n", " lost_detail = same_size_original - upscaled_shrunk_original_img\n", "\n", " img += lost_detail\n", " shrunk_original_img = resize_img(original_img, shape)\n", "\n", "save_img(img, fname=result_prefix + '.png')\n", "print(base_image_path)\n", "print(\"Done.\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Some Examples from Google" ] }, { "cell_type": 
"markdown", "metadata": {}, "source": [ "

\n", "

\n", "

\n", "

\n", "" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "### References\n", "XOR Tutorial: https://blog.thoughtram.io/machine-learning/2016/09/23/beginning-ml-with-keras-and-tensorflow.html

\n", "Data Tutorial: https://gist.github.com/ermaker/9be651e0117ff2595679

\n", "Visualize Filters: https://github.com/fchollet/keras/blob/master/examples/conv_filter_visualization.py

\n", "Deep Dreaming: https://github.com/fchollet/keras/blob/master/examples/deep_dream.py" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.6.1" } }, "nbformat": 4, "nbformat_minor": 2 }