Miles, Victoria and Giani, Stefano and Vogt, Oliver (2022) 'Recursive Encoder Network for the Automatic Analysis of STEP Files', Journal of Intelligent Manufacturing.
Abstract
Automated tools which can understand and interface with CAD (computer-aided design) models are of significant research interest due to the potential for improving efficiency in manufacturing processes. At present, most research into the use of artificial intelligence to interpret three-dimensional data takes input in the form of multiple two-dimensional images of the object or in the form of three-dimensional grids of voxels. The transformation of the input data necessary for these approaches inevitably leads to some loss of information and limitations of resolution. Existing research into the direct analysis of model files in STEP (standard for the exchange of product data) format tends to follow a rules-based approach to analysing models of a certain type, resulting in algorithms without the flexibility and complex understanding which artificial intelligence can provide. In this paper, a novel recursive encoder network for the automatic analysis of STEP files is presented. The encoder network is a flexible model with the potential for adaptation to a wide range of tasks and fine-tuning for specific CAD model datasets. Performance is evaluated using a machining feature classification task, with results showing accuracy approaching 100% and training time comparable to that of existing multi-view and voxel-based solutions, without the need for a GPU.
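The full architecture is described in the paper itself; purely as an illustration of the general idea, the hypothetical Python sketch below shows one way a recursive encoder could operate on the entity-reference graph of a STEP file, building each entity's vector from a type embedding combined with the encodings of the entities it references. The regex parsing, fixed random type embeddings, mean pooling and tanh combination here are illustrative assumptions, not the authors' published method.

```python
# Hypothetical sketch (not the published architecture): recursively encoding the
# entity-reference graph of a STEP file. Each STEP entity line, e.g.
# "#12 = CARTESIAN_POINT('', (0., 0., 0.));", may reference child entities by "#id";
# an encoder can combine an embedding of the entity type with the encodings of its
# children, bottom-up.
import re
import numpy as np

DIM = 16  # embedding size, arbitrary for this sketch
rng = np.random.default_rng(0)
type_embeddings = {}  # lazily created per entity type

def embed_type(name):
    # One fixed random vector per entity type, standing in for learned weights.
    if name not in type_embeddings:
        type_embeddings[name] = rng.normal(size=DIM)
    return type_embeddings[name]

def parse_step_entities(text):
    # Map entity id -> (type name, list of referenced child ids).
    entities = {}
    for m in re.finditer(r"#(\d+)\s*=\s*([A-Z0-9_]+)\s*\((.*?)\);", text, re.S):
        eid, etype, args = int(m.group(1)), m.group(2), m.group(3)
        children = [int(c) for c in re.findall(r"#(\d+)", args)]
        entities[eid] = (etype, children)
    return entities

def encode(eid, entities, cache):
    # Recursive encoder: an entity's vector is its type embedding combined with
    # the mean of its children's vectors, passed through a nonlinearity.
    if eid in cache:
        return cache[eid]
    etype, children = entities[eid]
    child_vecs = [encode(c, entities, cache) for c in children if c in entities]
    pooled = np.mean(child_vecs, axis=0) if child_vecs else np.zeros(DIM)
    vec = np.tanh(embed_type(etype) + pooled)
    cache[eid] = vec
    return vec

sample = """
#1 = CARTESIAN_POINT('', (0., 0., 0.));
#2 = DIRECTION('', (0., 0., 1.));
#3 = AXIS2_PLACEMENT_3D('', #1, #2, #2);
#4 = PLANE('', #3);
"""
entities = parse_step_entities(sample)
cache = {}
roots = [e for e in entities if not any(e in c for _, c in entities.values())]
model_vec = np.mean([encode(r, entities, cache) for r in roots], axis=0)
print(model_vec.shape)  # (16,)
```

In a trained network the type embeddings and the combination function would be learned parameters optimised for the downstream task (such as machining feature classification), rather than the fixed random vectors used in this sketch.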
Item Type: | Article |
---|---|
Full text: | Accepted Manuscript (AM), PDF (2561 KB), publisher-imposed embargo |
Full text: | Version of Record (VoR), PDF (2189 KB), available under a Creative Commons Attribution 4.0 licence |
Status: | Peer-reviewed |
Publisher Web site: | https://doi.org/10.1007/s10845-022-01998-x |
Publisher statement: | This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. |
Date accepted: | 15 July 2022 |
Date deposited: | 11 May 2022 |
Date of first online publication: | 07 August 2022 |
Date first made open access: | 16 August 2022 |