Data Augmentation using Adversarial Networks for Tea Diseases Detection

  R. Sandra Yuwana (1*), Fani Fauziah (2), Ana Heryana (3), Dikdik Krisnandi (4), R. Budiarianto Suryo Kusumo (5), Hilman F. Pardede (6)

(1) Research Center for Informatics Indonesian Institute of Sciences - LIPI - Indonesia
(2) Research Institute for Tea and Cinchona (RITC) Indonesian Agency for Agricultural Research and Development - Indonesia
(3) Research Center for Informatics Indonesian Institute of Sciences - LIPI - Indonesia
(4) Research Center for Informatics Indonesian Institute of Sciences - LIPI - Indonesia
(5) Research Center for Informatics Indonesian Institute of Sciences - LIPI - Indonesia
(6) Research Center for Informatics Indonesian Institute of Sciences - LIPI - Indonesia
(*) Corresponding Author

Received: June 03, 2020; Revised: July 20, 2020
Accepted: August 05, 2020; Published: August 31, 2020


How to cite (IEEE): R. Yuwana, F. Fauziah, A. Heryana, D. Krisnandi, R. S. Kusumo, and H. F. Pardede, "Data Augmentation using Adversarial Networks for Tea Diseases Detection," Jurnal Elektronika dan Telekomunikasi, vol. 20, no. 1, pp. 29-35, Aug. 2020. doi: 10.14203/jet.v20.29-35

Abstract

Deep learning technology gives better results when trained on an abundant amount of data. However, collecting such data is expensive and time-consuming, and limited data are often an unavoidable condition. To increase the amount of data, data augmentation is usually applied: the original data are transformed, for example by rotating, shifting, or both, to generate new data artificially. In this paper, generative adversarial networks (GAN) and deep convolutional GAN (DCGAN) are used for data augmentation, and both approaches are applied to tea disease detection. The performance of tea disease detection on the augmented data is evaluated using various deep convolutional neural network (DCNN) architectures, including AlexNet, DenseNet, ResNet, and Xception. The experimental results indicate that with GAN-based augmentation the highest accuracy, 88.84%, is obtained with the DenseNet architecture, compared with a baseline accuracy of 86.30% on the same architecture. DCGAN-based augmentation on the same architecture shows a similar trend, with an accuracy of 88.86%.
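
As a concrete illustration of the DCGAN-based augmentation described in the abstract, the sketch below shows a minimal generator/discriminator pair in TensorFlow/Keras. It is only an illustrative outline under assumed settings: the 64x64 RGB image size, layer widths, optimizer parameters, and batch size are assumptions, not the configuration reported in the paper. Images sampled from the trained generator would be combined with the real leaf images before training the DCNN classifiers (AlexNet, DenseNet, ResNet, Xception).

import tensorflow as tf
from tensorflow.keras import layers, models

LATENT_DIM = 100          # length of the random noise vector fed to the generator
IMG_SHAPE = (64, 64, 3)   # assumed size of the leaf images, rescaled to [-1, 1]
BATCH_SIZE = 32           # assumed mini-batch size

def build_generator():
    # Upsample a noise vector to a 64x64 RGB image with transposed convolutions.
    return models.Sequential([
        layers.Input(shape=(LATENT_DIM,)),
        layers.Dense(8 * 8 * 256),
        layers.Reshape((8, 8, 256)),
        layers.Conv2DTranspose(128, 4, strides=2, padding="same"),
        layers.BatchNormalization(),
        layers.LeakyReLU(0.2),
        layers.Conv2DTranspose(64, 4, strides=2, padding="same"),
        layers.BatchNormalization(),
        layers.LeakyReLU(0.2),
        layers.Conv2DTranspose(3, 4, strides=2, padding="same", activation="tanh"),
    ])

def build_discriminator():
    # Strided convolutions score an image as real or generated (logit output).
    return models.Sequential([
        layers.Input(shape=IMG_SHAPE),
        layers.Conv2D(64, 4, strides=2, padding="same"),
        layers.LeakyReLU(0.2),
        layers.Conv2D(128, 4, strides=2, padding="same"),
        layers.LeakyReLU(0.2),
        layers.Flatten(),
        layers.Dense(1),
    ])

generator = build_generator()
discriminator = build_discriminator()
bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)
g_opt = tf.keras.optimizers.Adam(2e-4, beta_1=0.5)
d_opt = tf.keras.optimizers.Adam(2e-4, beta_1=0.5)

@tf.function
def train_step(real_images):
    # One alternating update: the discriminator learns to separate real from fake
    # images, while the generator learns to produce images scored as real.
    noise = tf.random.normal([BATCH_SIZE, LATENT_DIM])
    with tf.GradientTape() as g_tape, tf.GradientTape() as d_tape:
        fake_images = generator(noise, training=True)
        real_logits = discriminator(real_images, training=True)
        fake_logits = discriminator(fake_images, training=True)
        d_loss = (bce(tf.ones_like(real_logits), real_logits)
                  + bce(tf.zeros_like(fake_logits), fake_logits))
        g_loss = bce(tf.ones_like(fake_logits), fake_logits)
    d_opt.apply_gradients(zip(d_tape.gradient(d_loss, discriminator.trainable_variables),
                              discriminator.trainable_variables))
    g_opt.apply_gradients(zip(g_tape.gradient(g_loss, generator.trainable_variables),
                              generator.trainable_variables))
    return d_loss, g_loss

# After training, images sampled from the generator (rescaled from [-1, 1] back to
# pixel values) would be added to the real training set before fitting the DCNN
# classifiers compared in the paper, e.g., DenseNet.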


  http://dx.doi.org/10.14203/jet.v20.29-35

Keywords


Tea diseases detection; data augmentation; GAN; DCGAN


Copyright (c) 2020 National Research and Innovation Agency

Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.