Young Researcher Paper Award 2023
🥇Winners

Notice of retraction
Vol. 34, No. 8(3), S&M3042

Notice of retraction
Vol. 32, No. 8(2), S&M2292

Print: ISSN 0914-4935
Online: ISSN 2435-0869
Sensors and Materials
is an international peer-reviewed open access journal that provides a forum for researchers working in multidisciplinary fields of sensing technology.
Sensors and Materials
is covered by Science Citation Index Expanded (Clarivate Analytics), Scopus (Elsevier), and other databases.

Instructions to authors
English    Japanese

Instructions for manuscript preparation
English    Japanese

Template
English

Publisher
 MYU K.K.
 Sensors and Materials
 1-23-3-303 Sendagi,
 Bunkyo-ku, Tokyo 113-0022, Japan
 Tel: 81-3-3827-8549
 Fax: 81-3-3827-8547

MYU Research, a scientific publisher, seeks a native English-speaking proofreader with a scientific background. A B.Sc. or higher degree is desirable. This is an in-office position; work hours are negotiable. Call 03-3827-8549 for further information.


MYU Research

(proofreading and recording)


MYU K.K.
(translation service)


The Art of Writing Scientific Papers

(How to write scientific papers)
(Japanese only)

Copyright (C) MYU K.K.
Published in advance: February 29, 2024

Delineation of Clinical Target Volume of Esophageal Cancer Based on 3D Dense Network with Embedded Capsule Modules [PDF]

Yong Huang, Feixiang Zhang, Kai Xu, and Chengcheng Fan

(Received July 10, 2023; Accepted January 16, 2024)

Keywords: deep learning, esophageal cancer, medical image processing, radiation therapy, target delineation

In this study, we propose a 3D dense network with embedded capsule modules (3D-DUCaps) for automatically delineating the clinical target volume of esophageal cancer, addressing the spatial dependence issue between parts and the whole that cannot be effectively captured by 2D networks. The network integrates capsule modules into the encoding layers of the U-Net to enhance feature learning capabilities and preserve more information, enabling the inference of poses and learning the relationship between parts and the whole. Additionally, dense connections are introduced to further promote the fusion of high-level semantic information and low-level feature information, enhancing the network's information propagation capabilities. Compared with traditional 2D deep learning networks, the proposed 3D deep learning network demonstrates stronger spatial awareness and superior boundary delineation capabilities, resulting in better delineation of the clinical target volume of esophageal cancer. Experimental results indicate that the 3D-DUCaps network achieves a 2.4% improvement in the Dice Similarity Coefficient metric compared with the classical 3D-UNet network.
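The Dice Similarity Coefficient reported above is the standard overlap measure DSC = 2|A ∩ B| / (|A| + |B|), where A and B are the sets of voxels labeled as target in the predicted and reference delineations. A minimal plain-Python sketch of this metric on flattened binary masks (illustrative only, independent of the paper's implementation):

```python
def dice_similarity(pred, truth):
    """Dice Similarity Coefficient for binary segmentation masks.

    DSC = 2 * |A intersect B| / (|A| + |B|), where A and B are the
    voxels labeled 1 in the predicted and reference delineations.
    Returns 1.0 when both masks are empty (trivial perfect agreement).
    """
    intersection = sum(p and t for p, t in zip(pred, truth))
    total = sum(pred) + sum(truth)
    if total == 0:
        return 1.0
    return 2.0 * intersection / total

# Example: flattened binary masks of a small volume
pred  = [0, 1, 1, 1, 0, 0, 1, 0]
truth = [0, 1, 1, 0, 0, 1, 1, 0]
print(dice_similarity(pred, truth))  # 2*3 / (4+4) = 0.75
```

In practice the masks are 3D volumes; flattening them voxel-wise, as above, leaves the metric unchanged.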

Corresponding authors: Kai Xu and Chengcheng Fan




Forthcoming Regular Issues


Forthcoming Special Issues

Applications of Novel Sensors and Related Technologies for Internet of Things
Guest editors: Teen-Hang Meen (National Formosa University), Wenbing Zhao (Cleveland State University), and Cheng-Fu Yang (National University of Kaohsiung)
Call for papers


Special Issue on Advanced Data Sensing and Processing Technologies for Smart Community and Smart Life
Guest editor: Tatsuya Yamazaki (Niigata University)
Call for papers


Special Issue on Advanced Sensing Technologies and Their Applications in Human/Animal Activity Recognition and Behavior Understanding
Guest editor: Kaori Fujinami (Tokyo University of Agriculture and Technology)
Call for papers


Special Issue on International Conference on Biosensors, Bioelectronics, Biomedical Devices, BioMEMS/NEMS and Applications 2023 (Bio4Apps 2023)
Guest editors: Dzung Viet Dao (Griffith University) and Cong Thanh Nguyen (Griffith University)
Conference website
Call for papers


Special Issue on Piezoelectric Thin Films and Piezoelectric MEMS
Guest editor: Isaku Kanno (Kobe University)
Call for papers


Special Issue on Advanced Micro/Nanomaterials for Various Sensor Applications (Selected Papers from ICASI 2023)
Guest editor: Sheng-Joue Young (National United University)
Conference website
Call for papers


Copyright (C) MYU K.K. All Rights Reserved.