Garbage Classification with a CNN

Code & Models ⋅ shijie ⋅ posted 3 months ago ⋅ last reply by 周筱筱 3 months ago ⋅ 369 views
Source: Heywhale community (和鲸社区). Author: Pinecone628


Cities across China have begun rolling out garbage sorting. Can we use the algorithms we study every day to build a simple garbage-classification model?

1. Introduction
It has been almost two weeks since Shanghai started enforcing garbage sorting. Can we use the machine learning and deep learning algorithms we normally study to build a simple garbage-classification model? Below we use a CNN to classify garbage. The dataset contains six classes of garbage (not the same categories as Shanghai's standard): glass, paper, cardboard, plastic, metal, and general trash. This article uses Keras for the implementation.


2. Import Packages and Data
In [1]:
import numpy as np
import matplotlib.pyplot as plt
from keras.preprocessing.image import ImageDataGenerator, load_img, img_to_array, array_to_img
from keras.layers import Conv2D, Flatten, MaxPooling2D, Dense
from keras.models import Sequential

import glob, os, random
Using TensorFlow backend.


In [2]:
base_path = '../input/trash_div7612/dataset-resized'
In [3]:
img_list = glob.glob(os.path.join(base_path, '*/*.jpg'))
We have 2,527 images in total. Let's display 6 of them at random.


In [4]:
print(len(img_list))
2527
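The `'*/*.jpg'` pattern above assumes the dataset is laid out as one subdirectory per class directly under `base_path`, which is exactly what `flow_from_directory` expects later. A minimal self-contained sketch of how the pattern matches, using a throwaway temporary directory instead of the real dataset:

```python
import glob, os, tempfile

# Build a tiny stand-in for the dataset layout: one folder per class.
base = tempfile.mkdtemp()
for cls in ['glass', 'paper']:
    os.makedirs(os.path.join(base, cls))
    open(os.path.join(base, cls, 'img1.jpg'), 'w').close()

# The same pattern the notebook uses: every .jpg exactly one level below base.
img_list = glob.glob(os.path.join(base, '*/*.jpg'))
print(len(img_list))  # 2
```

A single-star pattern like `'*.jpg'` would find nothing here, because no images live at the top level.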


In [5]:
for i, img_path in enumerate(random.sample(img_list, 6)):
    img = load_img(img_path)
    img = img_to_array(img, dtype=np.uint8)

    plt.subplot(2, 3, i+1)
    plt.imshow(img.squeeze())



3. Split the Data
In [6]:
train_datagen = ImageDataGenerator(
    rescale=1./255, shear_range=0.1, zoom_range=0.1,
    width_shift_range=0.1, height_shift_range=0.1, horizontal_flip=True,
    vertical_flip=True, validation_split=0.1)

test_datagen = ImageDataGenerator(
    rescale=1./255, validation_split=0.1)

train_generator = train_datagen.flow_from_directory(
    base_path, target_size=(300, 300), batch_size=16,
    class_mode='categorical', subset='training', seed=0)

validation_generator = test_datagen.flow_from_directory(
    base_path, target_size=(300, 300), batch_size=16,
    class_mode='categorical', subset='validation', seed=0)

labels = train_generator.class_indices
labels = dict((v, k) for k, v in labels.items())

print(labels)
Found 2276 images belonging to 6 classes.
Found 251 images belonging to 6 classes.
{0: 'cardboard', 1: 'glass', 2: 'metal', 3: 'paper', 4: 'plastic', 5: 'trash'}
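The two lines that build `labels` simply invert Keras's class-to-index mapping, so a predicted index can be turned back into a class name later. A standalone sketch of that inversion, using the mapping printed above:

```python
# class_indices as flow_from_directory reports it (names taken from the output above).
class_indices = {'cardboard': 0, 'glass': 1, 'metal': 2,
                 'paper': 3, 'plastic': 4, 'trash': 5}

# Invert it: index -> class name, exactly as in the cell above.
labels = dict((v, k) for k, v in class_indices.items())
print(labels[3])  # paper
```

This works because the mapping is one-to-one; each index corresponds to exactly one class directory.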


4. Build and Train the Model

In [7]:
model = Sequential([
    Conv2D(filters=32, kernel_size=3, padding='same', activation='relu', input_shape=(300, 300, 3)),
    MaxPooling2D(pool_size=2),

    Conv2D(filters=64, kernel_size=3, padding='same', activation='relu'),
    MaxPooling2D(pool_size=2),

    Conv2D(filters=32, kernel_size=3, padding='same', activation='relu'),
    MaxPooling2D(pool_size=2),

    Conv2D(filters=32, kernel_size=3, padding='same', activation='relu'),
    MaxPooling2D(pool_size=2),

    Flatten(),

    Dense(64, activation='relu'),

    Dense(6, activation='softmax')
])
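Before training, it is worth checking what `Flatten()` will actually see. With `padding='same'` each conv layer keeps the 300×300 spatial size, and each 2×2 max-pool halves it with floor division, so the feature map shrinks 300 → 150 → 75 → 37 → 18. A quick arithmetic check of the shapes and the size of the `Dense(64)` layer, in pure Python (no Keras needed):

```python
# Spatial size after four 'same'-padded convs, each followed by 2x2 max-pooling.
size = 300
for _ in range(4):
    size //= 2          # 'same' padding keeps size; pooling floors the halving
print(size)             # 18

# Flatten therefore emits 18 * 18 * 32 values (the last conv has 32 filters),
# and Dense(64) holds (18*18*32 + 1) * 64 weights plus biases.
flat = size * size * 32
print(flat)                 # 10368
print((flat + 1) * 64)      # 663616
```

The same numbers can be confirmed with `model.summary()` once the model is built; the fully connected layer dominates the parameter count, which is typical for small CNNs like this one.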


In [8]:
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['acc'])
In [9]:
model.fit_generator(train_generator, epochs=100,
                    steps_per_epoch=2276//32,  # note: batch_size is 16, so 2276//16 would visit every training image each epoch
                    validation_data=validation_generator,
                    validation_steps=251//32)
Epoch 1/100
71/71 [==============================] - 29s 404ms/step - loss: 1.7330 - acc: 0.2236 - val_loss: 1.6778 - val_acc: 0.3393
Epoch 2/100
71/71 [==============================] - 25s 359ms/step - loss: 1.5247 - acc: 0.3415 - val_loss: 1.4649 - val_acc: 0.3750
Epoch 3/100
71/71 [==============================] - 24s 344ms/step - loss: 1.4455 - acc: 0.4006 - val_loss: 1.4694 - val_acc: 0.3832
Epoch 4/100
71/71 [==============================] - 25s 348ms/step - loss: 1.3934 - acc: 0.4243 - val_loss: 1.5412 - val_acc: 0.3304
Epoch 5/100
71/71 [==============================] - 24s 341ms/step - loss: 1.3568 - acc: 0.4270 - val_loss: 1.4102 - val_acc: 0.4579
Epoch 6/100
71/71 [==============================] - 24s 344ms/step - loss: 1.2903 - acc: 0.4639 - val_loss: 1.3206 - val_acc: 0.4732
Epoch 7/100
71/71 [==============================] - 24s 341ms/step - loss: 1.3251 - acc: 0.4824 - val_loss: 1.2641 - val_acc: 0.4953
Epoch 8/100
71/71 [==============================] - 24s 344ms/step - loss: 1.1988 - acc: 0.5238 - val_loss: 1.3075 - val_acc: 0.4375
Epoch 9/100
71/71 [==============================] - 24s 343ms/step - loss: 1.2100 - acc: 0.5106 - val_loss: 1.4573 - val_acc: 0.4643
Epoch 10/100
71/71 [==============================] - 24s 343ms/step - loss: 1.1910 - acc: 0.5361 - val_loss: 1.2814 - val_acc: 0.4860
Epoch 11/100
71/71 [==============================] - 24s 340ms/step - loss: 1.1045 - acc: 0.5572 - val_loss: 1.2702 - val_acc: 0.5000
Epoch 12/100
71/71 [==============================] - 24s 342ms/step - loss: 1.0927 - acc: 0.5783 - val_loss: 1.1714 - val_acc: 0.5514
Epoch 13/100
71/71 [==============================] - 24s 344ms/step - loss: 1.1005 - acc: 0.5792 - val_loss: 1.1908 - val_acc: 0.5268
Epoch 14/100
71/71 [==============================] - 25s 347ms/step - loss: 1.0484 - acc: 0.6074 - val_loss: 1.2672 - val_acc: 0.4579
Epoch 15/100
71/71 [==============================] - 24s 338ms/step - loss: 1.0791 - acc: 0.5756 - val_loss: 1.1287 - val_acc: 0.5714
Epoch 16/100
71/71 [==============================] - 24s 342ms/step - loss: 1.0172 - acc: 0.6320 - val_loss: 1.0933 - val_acc: 0.5981
Epoch 17/100
71/71 [==============================] - 24s 340ms/step - loss: 0.9794 - acc: 0.6170 - val_loss: 1.2263 - val_acc: 0.5625
Epoch 18/100
71/71 [==============================] - 24s 342ms/step - loss: 0.9882 - acc: 0.6232 - val_loss: 1.0480 - val_acc: 0.6161
Epoch 19/100
71/71 [==============================] - 24s 339ms/step - loss: 0.9324 - acc: 0.6277 - val_loss: 1.3481 - val_acc: 0.6262
Epoch 20/100
71/71 [==============================] - 24s 341ms/step - loss: 0.9458 - acc: 0.6576 - val_loss: 1.2473 - val_acc: 0.5893
Epoch 21/100
71/71 [==============================] - 24s 337ms/step - loss: 1.0120 - acc: 0.6189 - val_loss: 1.0484 - val_acc: 0.5981
Epoch 22/100
71/71 [==============================] - 24s 342ms/step - loss: 0.8866 - acc: 0.6532 - val_loss: 0.9475 - val_acc: 0.6607
Epoch 23/100
71/71 [==============================] - 24s 337ms/step - loss: 0.9223 - acc: 0.6594 - val_loss: 1.0771 - val_acc: 0.5888
Epoch 24/100
71/71 [==============================] - 24s 342ms/step - loss: 0.8532 - acc: 0.6805 - val_loss: 1.1261 - val_acc: 0.5804
Epoch 25/100
71/71 [==============================] - 24s 336ms/step - loss: 0.8292 - acc: 0.6972 - val_loss: 1.0360 - val_acc: 0.6250
Epoch 26/100
71/71 [==============================] - 25s 348ms/step - loss: 0.8989 - acc: 0.6690 - val_loss: 1.1881 - val_acc: 0.5794
Epoch 27/100
71/71 [==============================] - 24s 340ms/step - loss: 0.8341 - acc: 0.6999 - val_loss: 1.0053 - val_acc: 0.5893
Epoch 28/100
71/71 [==============================] - 24s 340ms/step - loss: 0.8157 - acc: 0.6840 - val_loss: 1.0517 - val_acc: 0.6636
Epoch 29/100
71/71 [==============================] - 24s 339ms/step - loss: 0.8687 - acc: 0.6708 - val_loss: 0.8762 - val_acc: 0.6429
Epoch 30/100
71/71 [==============================] - 24s 342ms/step - loss: 0.8014 - acc: 0.6998 - val_loss: 1.0360 - val_acc: 0.5888
Epoch 31/100
71/71 [==============================] - 24s 338ms/step - loss: 0.7090 - acc: 0.7377 - val_loss: 0.8961 - val_acc: 0.6786
Epoch 32/100
71/71 [==============================] - 24s 339ms/step - loss: 0.8245 - acc: 0.7033 - val_loss: 0.8685 - val_acc: 0.6449
Epoch 33/100
71/71 [==============================] - 24s 335ms/step - loss: 0.7698 - acc: 0.7112 - val_loss: 0.9477 - val_acc: 0.6518
Epoch 34/100
71/71 [==============================] - 24s 340ms/step - loss: 0.7326 - acc: 0.7315 - val_loss: 0.8425 - val_acc: 0.6786
Epoch 35/100
71/71 [==============================] - 24s 335ms/step - loss: 0.7306 - acc: 0.7263 - val_loss: 1.0495 - val_acc: 0.7103
Epoch 36/100
71/71 [==============================] - 24s 339ms/step - loss: 0.7783 - acc: 0.7104 - val_loss: 1.0885 - val_acc: 0.6339
Epoch 37/100
71/71 [==============================] - 24s 337ms/step - loss: 0.7102 - acc: 0.7262 - val_loss: 0.7581 - val_acc: 0.6916
Epoch 38/100
71/71 [==============================] - 24s 339ms/step - loss: 0.6895 - acc: 0.7465 - val_loss: 0.8022 - val_acc: 0.7232
Epoch 39/100
71/71 [==============================] - 25s 346ms/step - loss: 0.6769 - acc: 0.7535 - val_loss: 0.8378 - val_acc: 0.7009
Epoch 40/100
71/71 [==============================] - 24s 340ms/step - loss: 0.7402 - acc: 0.7210 - val_loss: 0.8212 - val_acc: 0.6964
Epoch 41/100
71/71 [==============================] - 24s 334ms/step - loss: 0.7489 - acc: 0.7236 - val_loss: 1.0076 - val_acc: 0.6518
Epoch 42/100
71/71 [==============================] - 24s 341ms/step - loss: 0.6430 - acc: 0.7738 - val_loss: 0.9975 - val_acc: 0.6636
Epoch 43/100
71/71 [==============================] - 24s 335ms/step - loss: 0.6911 - acc: 0.7544 - val_loss: 0.8705 - val_acc: 0.7054
Epoch 44/100
71/71 [==============================] - 24s 340ms/step - loss: 0.6758 - acc: 0.7650 - val_loss: 0.8496 - val_acc: 0.6916
Epoch 45/100
71/71 [==============================] - 24s 335ms/step - loss: 0.6613 - acc: 0.7720 - val_loss: 1.0068 - val_acc: 0.6429
Epoch 46/100
71/71 [==============================] - 24s 338ms/step - loss: 0.5935 - acc: 0.7852 - val_loss: 0.8687 - val_acc: 0.6355
Epoch 47/100
71/71 [==============================] - 24s 335ms/step - loss: 0.6415 - acc: 0.7614 - val_loss: 0.8680 - val_acc: 0.6607
Epoch 48/100
71/71 [==============================] - 24s 336ms/step - loss: 0.6242 - acc: 0.7764 - val_loss: 0.8365 - val_acc: 0.7383
Epoch 49/100
71/71 [==============================] - 24s 334ms/step - loss: 0.5951 - acc: 0.7711 - val_loss: 0.8917 - val_acc: 0.7411
Epoch 50/100
71/71 [==============================] - 24s 336ms/step - loss: 0.5548 - acc: 0.7975 - val_loss: 0.9893 - val_acc: 0.6429
Epoch 51/100
71/71 [==============================] - 24s 343ms/step - loss: 0.6274 - acc: 0.7738 - val_loss: 0.7780 - val_acc: 0.7290
Epoch 52/100
71/71 [==============================] - 24s 339ms/step - loss: 0.5720 - acc: 0.7896 - val_loss: 0.7609 - val_acc: 0.7321
Epoch 53/100
71/71 [==============================] - 24s 336ms/step - loss: 0.6312 - acc: 0.7667 - val_loss: 0.9461 - val_acc: 0.6636
Epoch 54/100
71/71 [==============================] - 24s 339ms/step - loss: 0.6106 - acc: 0.7729 - val_loss: 0.7439 - val_acc: 0.7232
Epoch 55/100
71/71 [==============================] - 24s 335ms/step - loss: 0.5437 - acc: 0.8178 - val_loss: 0.7794 - val_acc: 0.7196
Epoch 56/100
71/71 [==============================] - 24s 341ms/step - loss: 0.5713 - acc: 0.8046 - val_loss: 0.9875 - val_acc: 0.6964
Epoch 57/100
71/71 [==============================] - 24s 337ms/step - loss: 0.5644 - acc: 0.7940 - val_loss: 1.0696 - val_acc: 0.6161
Epoch 58/100
71/71 [==============================] - 24s 338ms/step - loss: 0.5980 - acc: 0.7799 - val_loss: 0.8188 - val_acc: 0.7009
Epoch 59/100
71/71 [==============================] - 24s 336ms/step - loss: 0.5688 - acc: 0.7923 - val_loss: 0.7574 - val_acc: 0.7143
Epoch 60/100
71/71 [==============================] - 24s 337ms/step - loss: 0.5765 - acc: 0.7949 - val_loss: 0.8909 - val_acc: 0.6355
Epoch 61/100
71/71 [==============================] - 24s 335ms/step - loss: 0.5299 - acc: 0.8160 - val_loss: 0.7535 - val_acc: 0.7500
Epoch 62/100
71/71 [==============================] - 24s 337ms/step - loss: 0.5290 - acc: 0.8081 - val_loss: 1.1890 - val_acc: 0.6822
Epoch 63/100
71/71 [==============================] - 24s 335ms/step - loss: 0.5503 - acc: 0.7957 - val_loss: 0.6080 - val_acc: 0.7679
Epoch 64/100
71/71 [==============================] - 25s 348ms/step - loss: 0.5365 - acc: 0.8046 - val_loss: 0.8959 - val_acc: 0.6916
Epoch 65/100
71/71 [==============================] - 24s 336ms/step - loss: 0.5241 - acc: 0.8037 - val_loss: 1.0374 - val_acc: 0.6518
Epoch 66/100
71/71 [==============================] - 24s 338ms/step - loss: 0.4678 - acc: 0.8160 - val_loss: 0.6968 - val_acc: 0.7500
Epoch 67/100
71/71 [==============================] - 24s 336ms/step - loss: 0.4995 - acc: 0.8275 - val_loss: 0.8589 - val_acc: 0.7383
Epoch 68/100
71/71 [==============================] - 24s 341ms/step - loss: 0.4835 - acc: 0.8310 - val_loss: 0.8227 - val_acc: 0.6696
Epoch 69/100
71/71 [==============================] - 24s 336ms/step - loss: 0.5160 - acc: 0.8135 - val_loss: 0.8175 - val_acc: 0.7383
Epoch 70/100
71/71 [==============================] - 24s 338ms/step - loss: 0.5284 - acc: 0.8160 - val_loss: 0.9784 - val_acc: 0.6607
Epoch 71/100
71/71 [==============================] - 24s 334ms/step - loss: 0.5072 - acc: 0.8091 - val_loss: 0.7905 - val_acc: 0.7009
Epoch 72/100
71/71 [==============================] - 24s 340ms/step - loss: 0.4794 - acc: 0.8301 - val_loss: 0.7604 - val_acc: 0.7679
Epoch 73/100
71/71 [==============================] - 24s 332ms/step - loss: 0.4379 - acc: 0.8468 - val_loss: 0.9396 - val_acc: 0.7232
Epoch 74/100
71/71 [==============================] - 24s 338ms/step - loss: 0.5142 - acc: 0.8266 - val_loss: 0.7782 - val_acc: 0.6729
Epoch 75/100
71/71 [==============================] - 24s 333ms/step - loss: 0.4664 - acc: 0.8151 - val_loss: 1.3879 - val_acc: 0.6161
Epoch 76/100
71/71 [==============================] - 24s 337ms/step - loss: 0.4996 - acc: 0.8187 - val_loss: 0.7865 - val_acc: 0.7757
Epoch 77/100
71/71 [==============================] - 24s 344ms/step - loss: 0.4314 - acc: 0.8503 - val_loss: 1.1624 - val_acc: 0.5893
Epoch 78/100
71/71 [==============================] - 24s 338ms/step - loss: 0.4335 - acc: 0.8363 - val_loss: 0.6751 - val_acc: 0.7383
Epoch 79/100
71/71 [==============================] - 24s 333ms/step - loss: 0.4345 - acc: 0.8530 - val_loss: 0.8671 - val_acc: 0.6875
Epoch 80/100
71/71 [==============================] - 24s 336ms/step - loss: 0.4254 - acc: 0.8433 - val_loss: 0.6600 - val_acc: 0.7570
Epoch 81/100
71/71 [==============================] - 24s 335ms/step - loss: 0.4696 - acc: 0.8266 - val_loss: 0.8523 - val_acc: 0.6786
Epoch 82/100
71/71 [==============================] - 24s 335ms/step - loss: 0.5327 - acc: 0.7914 - val_loss: 0.8291 - val_acc: 0.6875
Epoch 83/100
71/71 [==============================] - 23s 330ms/step - loss: 0.4498 - acc: 0.8398 - val_loss: 0.8916 - val_acc: 0.7664
Epoch 84/100
71/71 [==============================] - 24s 337ms/step - loss: 0.4346 - acc: 0.8363 - val_loss: 0.9221 - val_acc: 0.6429
Epoch 85/100
71/71 [==============================] - 24s 332ms/step - loss: 0.4048 - acc: 0.8565 - val_loss: 0.7548 - val_acc: 0.7290
Epoch 86/100
71/71 [==============================] - 24s 335ms/step - loss: 0.4449 - acc: 0.8398 - val_loss: 1.0055 - val_acc: 0.7054
Epoch 87/100
71/71 [==============================] - 24s 331ms/step - loss: 0.4690 - acc: 0.8222 - val_loss: 1.0572 - val_acc: 0.6355
Epoch 88/100
71/71 [==============================] - 24s 335ms/step - loss: 0.3755 - acc: 0.8653 - val_loss: 0.8673 - val_acc: 0.6786
Epoch 89/100
71/71 [==============================] - 24s 343ms/step - loss: 0.4227 - acc: 0.8477 - val_loss: 1.0203 - val_acc: 0.6786
Epoch 90/100
71/71 [==============================] - 24s 334ms/step - loss: 0.4048 - acc: 0.8618 - val_loss: 0.9642 - val_acc: 0.6822
Epoch 91/100
71/71 [==============================] - 24s 333ms/step - loss: 0.4096 - acc: 0.8512 - val_loss: 0.7347 - val_acc: 0.7857
Epoch 92/100
71/71 [==============================] - 24s 336ms/step - loss: 0.3728 - acc: 0.8706 - val_loss: 0.9407 - val_acc: 0.6916
Epoch 93/100
71/71 [==============================] - 24s 332ms/step - loss: 0.4117 - acc: 0.8539 - val_loss: 0.8456 - val_acc: 0.7411
Epoch 94/100
71/71 [==============================] - 24s 339ms/step - loss: 0.4242 - acc: 0.8539 - val_loss: 1.2099 - val_acc: 0.6168
Epoch 95/100
71/71 [==============================] - 24s 332ms/step - loss: 0.4333 - acc: 0.8354 - val_loss: 0.7786 - val_acc: 0.7232
Epoch 96/100
71/71 [==============================] - 24s 334ms/step - loss: 0.3937 - acc: 0.8451 - val_loss: 1.0070 - val_acc: 0.7009
Epoch 97/100
71/71 [==============================] - 24s 332ms/step - loss: 0.3992 - acc: 0.8653 - val_loss: 0.9994 - val_acc: 0.6786
Epoch 98/100
71/71 [==============================] - 24s 335ms/step - loss: 0.3936 - acc: 0.8583 - val_loss: 0.7845 - val_acc: 0.7321
Epoch 99/100
71/71 [==============================] - 24s 332ms/step - loss: 0.4013 - acc: 0.8503 - val_loss: 0.6881 - val_acc: 0.7664
Epoch 100/100
71/71 [==============================] - 24s 335ms/step - loss: 0.3275 - acc: 0.8768 - val_loss: 0.9691 - val_acc: 0.6696
Out[9]:
<keras.callbacks.History at 0x7f0bb8d50828>
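The validation accuracy in the log above is noisy, so it is worth locating the best epoch rather than trusting the final one (which ends at only 0.6696). A small sketch that scans a few `val_acc` values transcribed from the log; in a real run the full list would come from the `History` object that `fit_generator` returns:

```python
# A handful of (epoch, val_acc) pairs transcribed from the training log above.
val_acc = {63: 0.7679, 76: 0.7757, 91: 0.7857, 100: 0.6696}

best_epoch = max(val_acc, key=val_acc.get)
print(best_epoch, val_acc[best_epoch])  # 91 0.7857
```

In practice, passing a `keras.callbacks.ModelCheckpoint(..., save_best_only=True)` callback to `fit_generator` saves the best-scoring weights automatically, so the noisy final epoch is never a problem.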


5. Results
Below we randomly draw 16 images from the validation set, display each one with its true label, and add the model's prediction. The accuracy looks reasonably good: the model identifies the correct class for most of the images.

In [10]:
test_x, test_y = validation_generator.__getitem__(1)

preds = model.predict(test_x)

plt.figure(figsize=(16, 16))
for i in range(16):
    plt.subplot(4, 4, i+1)
    plt.title('pred:%s / truth:%s' % (labels[np.argmax(preds[i])], labels[np.argmax(test_y[i])]))
    plt.imshow(test_x[i])
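The `np.argmax` calls in the loop convert each softmax vector (and each one-hot truth vector) back to a class index before looking it up in `labels`. A self-contained sketch with a fabricated prediction vector, so the lookup can be seen without loading the model:

```python
import numpy as np

labels = {0: 'cardboard', 1: 'glass', 2: 'metal', 3: 'paper', 4: 'plastic', 5: 'trash'}

# A made-up softmax output for one image: the model is most confident in class 4.
pred = np.array([0.05, 0.05, 0.10, 0.10, 0.60, 0.10])
truth = np.array([0, 0, 0, 0, 1, 0])  # one-hot ground truth

print('pred:%s / truth:%s' % (labels[np.argmax(pred)], labels[np.argmax(truth)]))
# pred:plastic / truth:plastic
```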



Replies: 1
  • Could you share the dataset and the source code?

    3 months ago