Textile Fabrics in an Optical Coherence Tomography Image Dataset

Journal of Textile Science & Engineering

ISSN: 2165-8064

Open Access

Mini Review - (2022) Volume 12, Issue 8

Kadir Ozlem*
*Correspondence: Kadir Ozlem, Department of Textile and Technology, Institute of Production Science, Karlsruhe, Germany, Email:
Department of Textile and Technology, Institute of Production Science, Karlsruhe, Germany

Received: 02-Aug-2022, Manuscript No. jtese-22-81709; Editor assigned: 04-Aug-2022, Pre QC No. P-81709; Reviewed: 16-Aug-2022, QC No. Q-81709; Revised: 21-Aug-2022, Manuscript No. R-81709; Published: 28-Aug-2022, DOI: 10.37421/2165-8064.2022.12.498
Citation: Ozlem, Kadir. “Textile Fabrics in an Optical Coherence Tomography Image Dataset.” J Textile Sci Eng 12 (2022): 498.
Copyright: © 2022 Ozlem K. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Abstract

Since successful sorting of various materials is necessary for high-quality recycling, classification of material types is essential in the recycling industry. Wool, cotton, and polyester are the most frequently used fiber materials in textiles. It is essential to quickly and accurately identify and sort various fiber types when recycling fabrics. The burn test, followed by a microscopic examination, is the standard method for determining the type of fabric fiber material. Because it involves cutting, burning, and examining the fabric's yarn, this traditional method is destructive and slow. With the help of deep learning and optical coherence tomography (OCT), we show that the identification procedure can be carried out in a nondestructive manner. A deep neural network is trained on OCT image scans of fabrics made of wool, cotton, and polyester fibers. The results demonstrate that the developed deep learning models can classify various types of fabric fiber materials. According to our findings, OCT imaging and deep learning enable the nondestructive identification of various fiber material types with high recall and precision. Because OCT and deep learning can classify the material type, this novel method can be applied automatically in recycling plants to sort wool, cotton, and polyester fabrics.

Keywords

Optical coherence tomography • Imaging • Wool • Polyester

Introduction

Various textile fabrics are captured in optical coherence tomography (OCT) images. Each textile fabric was made of only one material: wool, cotton, or polyester. For each material type, we took OCT images of three different fabrics, giving a total of nine different fabrics. We conducted at least one hundred scans of each material at various locations on each surface. The scans were saved in Portable Network Graphics (PNG) format with the scan length fixed at 2 mm so that the data were roughly comparable between samples. We categorized the material data into three groups: Group 1 contained only cotton fabrics, Group 2 only wool fabrics, and Group 3 only polyester fabrics. The labelled dataset for the deep learning training classes was created by organizing these groups into folders. This OCT fabric image dataset is made publicly available. Researchers can use the data to train deep learning networks, test already existing machine learning algorithms, or create new systems for automatically classifying and recycling materials [1,2].
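The grouping described above can be sketched in a few lines of Python: the per-fabric folder names (e.g. Wool1 or Cotton2) collapse into the three material classes by stripping the trailing fabric index. This is a minimal illustration, assuming the folder-naming scheme reported in the dataset description; the helper `folder_to_class` is ours, not part of the dataset tooling.

```python
import re

def folder_to_class(folder_name: str) -> str:
    """Map a per-fabric folder name (e.g. 'Wool2') to its material class."""
    return re.sub(r"\d+$", "", folder_name).lower()

# The nine fabric folders collapse into the three training classes.
folders = ["Wool1", "Wool2", "Wool3",
           "Cotton1", "Cotton2", "Cotton3",
           "Polyester1", "Polyester2", "Polyester3"]
classes = sorted({folder_to_class(f) for f in folders})
print(classes)  # ['cotton', 'polyester', 'wool']
```

A folder-per-class layout like this is what common deep learning data loaders (e.g. torchvision's `ImageFolder`) expect, which is likely why the dataset was organized this way.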

Discussion

The folders Wool1, Wool2, and Wool3 each contain OCT scans of one of three distinct fabrics made of pure wool fibers, since we recorded OCT images taken at random locations on only one fabric per folder. Similarly, the folders Cotton1, Cotton2, and Cotton3 consist of OCT scans of three distinct fabrics made of pure cotton fibers. OCT scans corresponding to three fabrics of pure polyester origin are contained in the folders Polyester1, Polyester2, and Polyester3. The time and date information for each OCT image is recorded in the name of the corresponding Portable Network Graphics file.
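Since each PNG name encodes the acquisition date and time, the timestamp can be recovered by parsing the filename. The dataset description does not state the exact format, so the pattern below (date and time separated by an underscore) is purely an assumption for illustration:

```python
from datetime import datetime

# Hypothetical filename pattern: the dataset only states that date and
# time are encoded in each PNG name; the exact format below is assumed.
FMT = "%Y-%m-%d_%H-%M-%S"

def scan_timestamp(filename: str) -> datetime:
    """Recover the acquisition time from a name like '2022-08-02_14-30-15.png'."""
    stem = filename.rsplit(".", 1)[0]  # drop the .png extension
    return datetime.strptime(stem, FMT)

t = scan_timestamp("2022-08-02_14-30-15.png")
print(t.year, t.hour)  # 2022 14
```

Whatever the actual format is, adapting `FMT` to it lets the scans be ordered chronologically or matched to measurement sessions.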

The experiment aimed to capture OCT images of textile fabrics and produce a dataset of OCT scans corresponding to fabric materials composed of only wool, cotton, or polyester fibers. Therefore, conventional burn testing followed by microscopic observation was used to determine the fiber content of the fabrics before recording the OCT scans; for the subsequent OCT measurements, we then retained only pure fabrics made of wool, cotton, or polyester fibers [3]. For each type of fiber, we collected three distinct fabrics, producing nine distinct samples in total. Using the OCT technique, each sample was individually measured on the sample arm. We acquired OCT images with the Thorlabs CAL110C1 imaging system, using a laser diode with a central wavelength of 930 nm. The diode produces broadband photons that enable speckle-free imaging. The ThorImage OCT software was used to record the OCT images. We placed the samples to be scanned on the sample arm. 2D image scans, known as OCT B-scans, are obtained by scanning the light beam across the sample's surface and adding the corresponding OCT A-scans one at a time [4,5].
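The B-scan assembly described above can be sketched in plain Python: each A-scan is a 1-D depth profile acquired at one lateral beam position, and placing the A-scans side by side as columns yields the 2-D B-scan image. This is a sketch of the principle only (with toy intensity values), not the ThorImage implementation:

```python
# Each A-scan is a depth profile (one intensity value per depth pixel)
# acquired at a single lateral beam position on the fabric surface.
ascans = [
    [0.1, 0.8, 0.3, 0.1],  # lateral position 0
    [0.2, 0.9, 0.2, 0.1],  # lateral position 1
    [0.1, 0.7, 0.4, 0.2],  # lateral position 2
]

# A B-scan places the A-scans side by side: rows index depth,
# columns index lateral position along the 2 mm scan path.
bscan = [list(depth_row) for depth_row in zip(*ascans)]

print(len(bscan), len(bscan[0]))  # 4 depth rows x 3 lateral columns
```

In the real system, each depth profile comes from the interference spectrum at one beam position, and the number of lateral columns is set by how many positions are sampled along the 2 mm scan path.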

Conclusion

All OCT-B images from the nine samples, each individually scanned using the OCT method, were saved in Portable Network Graphics format in separate folders. The scan path was set to 2 mm for each sample, which provided data that were roughly comparable across samples. The OCT images were saved as raw OCT-B files using the ThorImage application. We did not apply any additional image processing or filtering, and we uploaded the OCT images unedited. Because the majority of deep learning algorithms require at least one hundred images per class for training, we provide 120 to 200 OCT scans for each fabric.
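With 120 to 200 scans per class, a standard first step before training is a per-class (stratified) train/validation split. The sketch below assumes illustrative per-class counts within the stated range and an 80/20 split ratio; neither the counts nor the ratio comes from the paper:

```python
import random

# Illustrative per-class scan counts within the stated 120-200 range.
counts = {"wool": 150, "cotton": 180, "polyester": 120}

def stratified_split(counts, train_frac=0.8, seed=0):
    """Split each class's image indices into train/validation sets."""
    rng = random.Random(seed)
    split = {}
    for cls, n in counts.items():
        idx = list(range(n))
        rng.shuffle(idx)          # randomize within the class
        cut = int(n * train_frac)  # 80% of the class goes to training
        split[cls] = (idx[:cut], idx[cut:])
    return split

split = stratified_split(counts)
print({c: (len(tr), len(va)) for c, (tr, va) in split.items()})
# {'wool': (120, 30), 'cotton': (144, 36), 'polyester': (96, 24)}
```

Splitting within each class keeps the class balance of the validation set close to that of the training set, which matters here because the per-class counts are unequal.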

Acknowledgement

None.

Conflict of Interest

None.

References

  1. Scataglini, Sofia, Stijn Verwulgen, Eddy Roosens and Damien Van Tiggelen. "Measuring spatiotemporal parameters on treadmill walking using wearable inertial system." Sensors 21 (2021): 4441.
  2. Lofterod, Bjorn, Terje Terjesen, Ann-Britt Huse and Reidun Jahnsen. "Preoperative gait analysis has a substantial effect on orthopedic decision making in children with cerebral palsy: Comparison between clinical evaluation and gait analysis in 60 patients." Acta Orthopaedica 78 (2007): 74-80.
  3. Baker, Richard. "Gait analysis methods in rehabilitation." J Neuroeng Rehabil 3 (2006): 1-10.
  4. Simon, Sheldon R. "Quantification of human motion: Gait analysis benefits and limitations to its application to clinical problems." J Biomech 37 (2004): 1869-1880.
  5. Chambers, Henry G., and David H. Sutherland. "A practical guide to gait analysis." J Am Acad Orthop Surg 10 (2002): 222-231.
