Article

Advanced Approaches to Material Processing in FFF 3D Printing: Integration of AR-Guided Maintenance for Optimized Manufacturing

Department of Computer Aided Manufacturing Technologies, Faculty of Manufacturing Technologies with a Seat in Presov, Technical University of Kosice, Bayerova 1, 08001 Presov, Slovakia
*
Author to whom correspondence should be addressed.
J. Manuf. Mater. Process. 2025, 9(2), 47; https://doi.org/10.3390/jmmp9020047
Submission received: 31 December 2024 / Revised: 24 January 2025 / Accepted: 29 January 2025 / Published: 3 February 2025

Abstract

The field of additive manufacturing increasingly demands innovative solutions to optimize material processing, improve equipment efficiency, and address maintenance challenges in high-utilization environments. This study investigates the operation and management of an FFF 3D printing production line comprising eight remotely controlled printers. The system supports custom manufacturing and educational activities, focusing on processing a range of thermoplastics and composite materials. A key contribution of this work lies in addressing the impact of frequent hardware servicing caused by shared use among users. Augmented reality (AR)-guided assembly and disassembly workflows were developed to ensure uninterrupted operations. These workflows are accessible via smart devices and provide step-by-step guidance tailored to specific material and equipment requirements. The research evaluates the effectiveness of AR-enhanced maintenance in minimizing downtime, extending equipment lifespans, and ensuring consistent material performance during manufacturing processes. Furthermore, it explores the role of AR in maintaining the mechanical, thermal, and chemical properties of processed materials, ensuring high-quality outputs across diverse applications. This paper highlights the integration of advanced material processing methodologies with emerging technologies like AR, aligning with the focus on enhancing manufacturing schemes. The findings contribute to improving process efficiency and adaptability in additive manufacturing, offering insights into scalable solutions for remote-controlled and multi-user production systems.

1. Introduction

In recent years, education has increasingly shifted into the digital realm, where technology plays a key role in improving learning methods and adapting educational processes to the needs of modern students. The use of innovations such as AR, the Internet of Things (IoT), and Industry 4.0 principles opens new possibilities for interactive and personalized learning. These technologies not only enable students to acquire new skills but also prepare them for challenges brought about by the rapidly changing technological and labor markets.
One of the most innovative technologies in education is AR, which allows students to immerse themselves interactively in the learning process through three-dimensional models and simulations. AR differs from traditional teaching methods, as students can visualize and manipulate objects that would otherwise be inaccessible or difficult to comprehend. In technical fields, such as engineering, medicine, and natural sciences, AR offers a unique opportunity for experimentation in a safe environment, allowing students to try various procedures without risk. This technology contributes to a deeper understanding of complex topics and enables practical experimentation under simulated conditions. Moreover, AR promotes active learning, engaging students in the process, leading to better information retention and the application of theoretical knowledge to real-world problems. By utilizing AR, education can gain a new dimension, making theoretical concepts more accessible and comprehensible. As illustrated in Figure 1, virtual and augmented reality technologies are currently a significant trend, which is also evident in the growing demand for these technologies in the industrial sector. According to available statistical data, the forecast indicates a continued upward trajectory.
Researchers worldwide have been engaged in this topic, with numerous studies highlighting the potential of modern technologies. For example, Murat Akçayır et al. (2017) conducted a systematic literature review on AR in education, identifying key factors such as AR advantages in improving learning outcomes, challenges related to technical issues and usability, and the increasing number of studies in this field [1]. Technologies like AR and VR enable educators to simulate real scenarios and create immersive learning environments, and they provide students with practical experiences that effectively link theory and practice, as explored by Abdullah M. Al-Ansi et al. (2023) [2]. Valentina Di Pasquale et al. (2024) examined the use of VR as a tool for operator training in assembly and disassembly tasks, evaluating its impact on improving performance, productivity, and safety while identifying existing gaps in its application, suggesting the need for further research for full utilization [3].
Recent studies highlight critical challenges in maintaining the mechanical integrity of 3D-printed parts, particularly the impact of voids on structural reliability. Gonabadi et al. (2024) investigated the size effects of voids, offering detailed insights into strategies to minimize defects and enhance material consistency. This aligns with the present study’s emphasis on leveraging AR-guided workflows to improve manufacturing outcomes [4]. A study by Buń et al. (2021) emphasized the role of AR in providing remote support and training for complex industrial operations [5]. Their findings highlighted AR’s ability to overlay real-time, task-relevant information on physical spaces, enhancing user efficiency in industrial environments. Similarly, Kurasov (2021) explored Industry 4.0 technologies, including AR, emphasizing its potential in creating “smart factories” where systems interact autonomously, significantly improving productivity [6]. Additionally, Aquino et al. (2023) proposed a model identifying critical factors influencing AR adoption in industrial contexts, such as task complexity, workforce skills, and technological readiness [7].
The IoT represents another significant technology that can transform how we learn. The IoT enables the connection of various devices and sensors that collect real-time data, providing valuable insights into a subject’s performance. This data collection can be used to personalize the teaching process, as educators can adapt content and tasks to individual needs. This approach enhances student interactions and customizes the learning process to their specific requirements. The IoT also facilitates devices that monitor student progress in real time, providing immediate feedback, which is critical for improving learning (for instance, monitoring progress in practical tasks using sensors or measuring time and accuracy during various operations). These data can be analyzed to identify areas where students need to improve their skills. A study by Valentina Terzieva et al. (2022) explores the role of the IoT in creating intelligent environments, particularly in education, demonstrating how IoT devices, including prototypes with communication protocols, can improve the educational process and contribute to personalized learning through real-time data collection and analysis [8]. Challenges in using the IoT in education, focusing on improving the quality of education through smart classrooms, e-learning, personalized learning, and security systems, were addressed in a publication by Tira Nur Fitria et al. (2023) that identified challenges such as high costs, technical needs, and cyber risks, along with proposing solutions for effective IoT implementation [9]. Finally, Industry 4.0 principles such as automation, robotics, and AR/VR significantly influence educational environments. The integration of advanced technologies familiarizes students with real-world industrial tools, bridging the gap between academic knowledge and practical skills. A study by Carlos Alberto Schettini Pinto et al. (2023) focuses on the application of Education 4.0 in Industry 4.0, emphasizing the need for further research in utilizing new educational technologies and highlighting the importance of connectivity and hybrid learning for current and future generations [10]. Eduardo Moraes et al. (2022) pointed out the use of Industry 4.0 technologies in education, focusing on their contribution to learning, noting that these technologies are more commonly used at the university level [11]. Adel A. (2024) conducted a comprehensive analysis of smart education in the context of Industry 4.0/5.0 and its key contributions to developing innovative technologies and methodologies that can transform the educational environment and prepare students to collaborate with advanced technologies in the future, while also addressing challenges such as privacy protection, the digital divide, and ethical considerations in technology implementation [12].
Building upon previous research by the authors in the field of the IoT and its applications in “Maintenance 4.0”, significant efficiency improvements were demonstrated in AR-assisted interventions, such as spindle drive repairs on the Emco Mill 55 (EMCO GmbH, Hallein-Taxach, Austria), as can be seen in Figure 2. These interventions significantly enhanced the effectiveness of the manufacturing process. Leveraging these positive outcomes, this study aims to extend these technologies and processes to enhance the efficiency of an additive manufacturing line concept that prioritizes availability, remote accessibility, and universal applicability [13,14,15].
This research focuses on two primary aspects: the first explores IoT-enabled remote monitoring and management, which remains the subject of an ongoing study. The second, highlighted in this article, emphasizes accessibility and universal application. The manufacturing line is designed as an educational platform for training and familiarizing personnel and students with 3D printing technologies, particularly those who may lack prior knowledge of additive manufacturing, FFF devices, or materials science [16]. The presented process is versatile and capable of accommodating a wide range of materials, including fiber-reinforced composites. Gonabadi et al. (2024) provide insights into the effects of volume fraction, aspect ratio, and fiber type on mechanical performance, offering a framework for optimizing material properties. These findings demonstrate that, even with advanced materials like carbon fiber-reinforced filaments, the methodology described in this study would remain effective and applicable [17].
To address these challenges, AR sequences will be developed for accessible smart devices, guiding users through general material and FFF technology knowledge as well as complex maintenance tasks. For research purposes, a standardized sequence for replacing the print head on a test device (Ender 3v2) will be created in multiple variations and verified for its impact on user efficiency.

2. Materials and Methods

To ensure the successful implementation of AR technology and the evaluation of its efficiency, this study employed carefully selected hardware and software tools. As depicted in Figure 3, the Creality Ender 3v2 printer serves as a training unit for the 3D printing farm, which is also visible in the image. It is important to note that this device is not part of the 3D printing farm; it is designated solely for the creation of sequences, the implementation of AR technology, and research. If the results prove successful and relevant, the findings will be applied to the production line. The eight Ender 3v2 Neo printers in the 3D printing farm are connected to a server that individually monitors the printing process on each device. Over a Wi-Fi network, online monitoring via webcam is possible, allowing interventions in, or interruption of, the printing process. Power consumption is also monitored, enabling automatic emergency shutdowns when necessary.
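The per-printer power monitoring and automatic emergency shutdown described above can be sketched as follows. This is a minimal conceptual sketch, not the authors' server implementation: the class name, the 300 W threshold, and the shutdown mechanism are all hypothetical.

```python
# Illustrative sketch of per-printer power monitoring with an automatic
# emergency shutdown, as described for the 3D printing farm. All names
# and the 300 W threshold are assumptions; the paper does not detail the
# actual server-side implementation.

POWER_LIMIT_W = 300.0  # assumed per-printer safety threshold


class PrinterMonitor:
    def __init__(self, printer_id: str, power_limit_w: float = POWER_LIMIT_W):
        self.printer_id = printer_id
        self.power_limit_w = power_limit_w
        self.running = True

    def report_power(self, watts: float) -> bool:
        """Record a power reading; trigger shutdown if the limit is exceeded.
        Returns True if the printer is still running afterwards."""
        if watts > self.power_limit_w:
            self.emergency_shutdown()
        return self.running

    def emergency_shutdown(self) -> None:
        # In the real farm, the server would cut power to the printer;
        # here we only flag the state change.
        self.running = False


monitor = PrinterMonitor("ender3v2-neo-01")
print(monitor.report_power(120.0))  # normal operation -> True
print(monitor.report_power(450.0))  # exceeds limit -> False
```

In a real deployment, this logic would run on the monitoring server and be fed by the same data stream used for the webcam-based supervision.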

2.1. Materials

The setup for verifying the efficiency of AR in the printing process consists of the following:
  • Creality Ender 3v2. The study used one unit of this cost-effective, Cartesian-style FFF 3D printer as a verification device for AR sequence development; it can be seen in Figure 3.
  • Creality Ender 3v2 Neo. Eight units of this upgraded model, featuring enhanced hardware for improved reliability, were employed in the additive manufacturing line. Each printer uses a Bowden-style extrusion system and offers a build volume of 220 × 220 × 260 mm.
  • Filament materials. A variety of thermoplastics and composites, including PLA, ABS, PETG, and carbon fiber-reinforced filaments, were utilized. These materials were selected for their mechanical, thermal, and chemical properties, which are relevant to diverse applications.
  • Augmented reality tools. The creation and implementation of AR-based assembly and maintenance workflows involves a combination of several specialized software solutions. Rather than relying on a single platform, this approach leverages the synergy of multiple tools to prepare complex sequences optimized for smart devices, even those with lower hardware performance. The creation of the software component of the AR sequence consists of the following steps:
    • Model creation. The process begins with developing a CAD model using commonly available software; in this case, PTC Creo (v11.0.0) was employed. These models form the foundation for the AR sequences described in the subsequent sections.
    • Sequence design. PTC Illustrate (v6.3) was utilized to create snapshots within the sequence. This process resembles work in Microsoft PowerPoint, where each slide represents an action (complete with animations), time intervals for each operation, and warnings.
    • AR integration. Vuforia Studio (v9.23.12.0) combines the CAD models with vector data from PTC Illustrate, allowing for the creation of smooth animations that are visible in AR. It also facilitates the inclusion of additional information, such as warnings about high voltages, temperature alerts, step sequences, and other safety features.
    • AR visualization. Vuforia View (v9.23.2) enables the visualization of AR sequences on smart devices.
  • Smart devices. AR workflows were accessed via tablets and smartphones capable of running AR applications.
  • Device compatibility. Towards the end of 2023, PTC ceased support for several Android, Surface, and Windows devices. Currently, the software supports Android devices running versions 11 to 15 and iOS devices running versions 16 to 18. For verification purposes, the AR output was created specifically for use on an Apple iPad 11 (Apple Inc., Cupertino, CA, USA); the user interface of the output application was therefore tailored to the parameters of this device.

2.2. Methods

This section is divided into two parts: the first justifies the selection and describes the preparation of the verification sequence for the disassembly and reassembly of the print head; the second outlines the methodology for measuring and evaluating the individual steps.

2.2.1. Process of Creating the AR Sequence

The development of AR sequences for maintenance guidance is a structured process involving multiple software tools and methodologies. This section details the individual phases of creating an AR sequence for the disassembly and reassembly of a 3D printer’s print head, as shown in Figure 4.
Phase 1 is the CAD model creation. The process begins with the development of a detailed CAD model of the print head assembly using PTC Creo Parametric. The model represents the physical components, including their structure and intricate details, to serve as the foundation for the AR sequence. While the specific modeling process is not elaborated on, it involves standard 3D modeling techniques based on physical measurements and visual references of the print head. The CAD model forms the baseline for all subsequent steps in creating the AR sequence.
Phase 2 involves the sequence design in PTC Illustrate. Here, the sequence is divided into individual steps, each represented by a snapshot, and each snapshot corresponds to a single action in the maintenance process. An example can be seen in Figure 5, which shows the following steps:
  • Step 1: Removal of screws and the print head cover.
  • Step 2: Detachment of the Bowden PTFE tube from the quick-connect fitting.
  • Step 3: Removal of the heating element and thermistor from the heat block, accompanied by warnings for the operator about preheating the print head and adhering to safety protocols to avoid burns.
The sequence contains 42 steps, providing detailed animations for screw removal, close-up views of small and hard-to-reach components, warnings about high temperatures or voltages, and other essential information. The objective is to enable even unqualified personnel to complete the maintenance tasks safely and efficiently. At the end of this phase, the sequence is saved in PVZ format, which is a compressed and detailed file format. From this, a PVI file is exported that contains vector data that are essential for integration into the next phase.
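The per-step structure described above (an action, a time interval, and optional operator warnings) can be sketched as a simple data model. This is purely illustrative: the class and field names are assumptions, and PTC's proprietary PVZ/PVI formats are not reproduced here.

```python
# Illustrative data model mirroring the information the authors encode
# per snapshot: an action, a planned time interval, and safety warnings.
# Conceptual sketch only; not PTC's actual PVZ/PVI representation.
from dataclasses import dataclass, field


@dataclass
class Step:
    index: int
    action: str
    duration_s: float                      # planned time interval for the operation
    warnings: list = field(default_factory=list)


@dataclass
class MaintenanceSequence:
    name: str
    steps: list = field(default_factory=list)

    def add(self, action: str, duration_s: float, warnings=None) -> None:
        self.steps.append(Step(len(self.steps) + 1, action, duration_s, warnings or []))


# The first three steps of the print head sequence, as listed above
# (durations are assumed for illustration):
seq = MaintenanceSequence("Ender 3v2 print head replacement")
seq.add("Remove screws and print head cover", 60)
seq.add("Detach Bowden PTFE tube from quick-connect fitting", 30)
seq.add("Remove heating element and thermistor from heat block", 90,
        warnings=["Preheat print head", "Risk of burns: follow safety protocol"])
```

Modeling each snapshot as a self-contained record with its own warnings is what allows the full 42-step sequence to be replayed, reordered, or split into disassembly and reassembly halves without duplicating safety information.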
The vector data extracted from CAD files are critical for the AR sequence development process. Specifically, in the context of the PTC toolchain, vector data are encapsulated within PVI (Product View Instruction) files exported from PVZ archives. These files include geometric dimensions, animation paths, and component relationships. Such data ensure that the alignment of virtual components with physical models is precise, facilitating seamless interaction during AR visualizations. In Vuforia Studio, PVI files serve as a bridge between CAD models and AR workflows, enabling the integration of animations, timing, and user instructions. The structured data format allows for efficient optimization and scalability, particularly when preparing sequences for devices with varying hardware capabilities. The importance of PVI files and their applications in AR has been highlighted in PTC community resources, which document their role in streamlining AR experiences for technical maintenance tasks [18].
A part of this phase also involves a detailed examination of the AR interface design process. The AR interface design process employed in this study incorporates a user-centric approach, ensuring functionality, intuitiveness, and adaptability for various manufacturing applications. Using Vuforia Studio, the interface was designed to balance simplicity for end-users and robust capabilities for advanced interactions. The process follows these key steps:
  • Initial layout design. The interface begins with a 2D framework that outlines essential controls, such as model rotation, zoom, and sequence initiations. The placement of these controls adheres to ergonomic principles to minimize user effort and maximize clarity.
  • Integration of interactive elements. Interactive elements, such as buttons, sliders, and gesture-based controls, are embedded. Each element is bound to specific model actions, including rotation along axes, resetting positions, and toggling annotations. The flexibility of these components ensures compatibility with diverse AR devices, including mobile, tablet, and head-mounted displays.
  • Dynamic feedback mechanisms. To enhance user experience and operational accuracy, the interface integrates visual cues such as color-coded highlights, flashing elements, and progress indicators. These features provide immediate feedback, guiding users through each step of the process.
  • Testing and iterative refinement. Prototypes of the interface are tested in simulated environments to identify potential usability issues. Feedback from these tests informs iterative improvements, optimizing the interface for real-world applications.
The designed interface is demonstrated in Figure 6, showcasing both the development environment in Vuforia Studio and the live preview in augmented reality. This visualization highlights the intuitive arrangement of controls and their seamless integration with the AR environment.
In cases where materials, such as fiber-reinforced composites, are incorporated into the manufacturing process, this interface design approach remains applicable. The flexibility of the framework ensures that the interactive elements and visual feedback mechanisms adapt to accommodate the specific requirements of such materials. By combining user feedback with advanced software capabilities, this AR interface design process exemplifies a practical and scalable solution for integrating augmented reality into industrial workflows.
Phase 3 involves integration into Vuforia Studio. In this phase, the components from the previous steps are integrated within Vuforia Studio. This includes the following:
  • Importing the optimized CAD model assembly (in STEP format) with functional constraints.
  • Integrating PVI data containing animation details, timings, and warnings.
  • Configuring the display method, cloud accessibility, and user interface for the output device.
The integration process is user-friendly, requiring no advanced programming skills. Vuforia Studio offers features for CAD file optimization, such as generating models with varying polygon mesh qualities. This optimization is crucial for visualization on devices with lower hardware performance. Two display methods were configured for verification:
  • Positional Tracking with ThingMark: The marker’s position was deliberately set to prevent overlap between the physical and virtual models.
  • Model Tracker: The physical 3D printer model itself acts as a positional marker, achieving 100% overlap with the virtual model.
A third option, Spatial Targeting, was excluded due to issues with lighting conditions and the absence of depth sensors in many smart devices. Figure 7 illustrates the two implemented tracking methods.
Phase 4: Finalization in Vuforia View. The final phase involves creating the user interface in Vuforia View. This step assigns essential CAD functions (e.g., scale, rotate) to specific buttons to enhance visualization. Additionally, playback controls (e.g., play, pause, reset position) are configured for managing the sequence. The method of linking and interaction among the parts described in Phases 1 through 4 can be seen in Figure 8.
The resulting display of the AR sequence can be seen in Figure 9. Figure 9a shows the beginning of the sequence, where, with the print head cover hidden, the user is alerted that this is a hot-end replacement. In Figure 9b, the cover has already been removed, and Figure 9c shows a close-up of the print head for better clarity.
The result is a comprehensive AR sequence comprising 21 steps for disassembly and 21 steps for reassembly of the print head, optimized for educational and maintenance purposes.

2.2.2. Verification Process

The accuracy and effectiveness of the AR-guided workflows were verified through a multi-stage evaluation process consisting of the aspects shown in Table 1.

2.2.3. Formulas and Calculations

The formulas used in this study, such as the Efficiency Improvement Ratio (EIR), Error Rate Reduction (ERR), and Overall Workflow Efficiency (OWE), were employed to quantify the impact of AR-guided workflows on maintenance performance. These formulas were selected based on their applicability to evaluating operational efficiency and reducing error rates in industrial processes. Similar approaches have been utilized in various studies to assess workflow optimization and process improvements in industrial and educational contexts.
The Efficiency Improvement Ratio (EIR): This formula measures the percentage improvement in task efficiency, comparing AR-assisted workflows to traditional methods. It highlights the time saved during maintenance tasks. The EIR was adapted from optimization frameworks in process management research [19].
Efficiency Improvement Ratio (EIR):
EIR = ((T_manual − T_AR) / T_manual) × 100
where
T_manual = time for manual maintenance (minutes);
T_AR = time for AR-guided maintenance (minutes).
Error Rate Reduction (ERR): This formula evaluates the reduction in errors, focusing on the ability of AR guidance to minimize human mistakes during complex tasks.
Overall Workflow Efficiency (OWE): The OWE combines time efficiency and error-correction metrics to provide a comprehensive view of workflow performance. This approach aligns with methodologies discussed in studies on Maintenance 4.0 and AR-enhanced manufacturing processes [20,21]. The OWE is a metric used to evaluate operator efficiency during specific tasks, incorporating factors such as task completion time, error rate, and overall accuracy to provide a holistic assessment of performance.
ERR = ((E_manual − E_AR) / E_manual) × 100
where
E_manual = error rate for the manual process (errors per task);
E_AR = error rate for AR-guided processes (errors per task).
OWE = ((T_total − T_errors × T_correction) / T_total) × 100
where
T_total = total time allocated for task completion (minutes);
T_errors = number of errors recorded (count);
T_correction = average time to correct an error (minutes).
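The three metrics above translate directly into code. The sketch below implements them; the sample inputs are the completion times (25, 27, and 40 min) and error rates (5%, 7%, and 15%) reported for the three test subjects in the Results section.

```python
# Direct implementation of the three efficiency metrics defined above.
# Sample inputs use the completion times and error rates reported in
# the Results section for the AR-assisted and manual test subjects.

def eir(t_manual: float, t_ar: float) -> float:
    """Efficiency Improvement Ratio (%): time saved by AR guidance."""
    return (t_manual - t_ar) / t_manual * 100


def err(e_manual: float, e_ar: float) -> float:
    """Error Rate Reduction (%): errors avoided through AR guidance."""
    return (e_manual - e_ar) / e_manual * 100


def owe(t_total: float, t_errors: int, t_correction: float) -> float:
    """Overall Workflow Efficiency (%): share of the allocated time
    not lost to error correction."""
    return (t_total - t_errors * t_correction) / t_total * 100


print(round(eir(40, 25), 1))  # Subject 1 vs. manual -> 37.5
print(round(eir(40, 27), 1))  # Subject 2 vs. manual -> 32.5
print(round(err(15, 5), 1))   # Subject 1 -> 66.7
print(round(err(15, 7), 1))   # Subject 2 -> 53.3
```

The functions reproduce the EIR and ERR values reported later in the paper; the OWE inputs (total allocated time, error count, and correction time per subject) are not reported individually, so only the formula itself is shown here.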
In this study, the formulas were specifically applied to evaluate the following:
  • Time Efficiency: Measured by comparing the completion times for each test group.
  • Error Reduction: Analyzed through error rates recorded during assembly and disassembly tasks.
  • Workflow Optimization: Evaluated using the OWE formula to integrate both time and error corrections, providing a holistic efficiency metric.
These calculations provide actionable insights into the effectiveness of AR-guided workflows and their potential to enhance both educational and operational settings in additive manufacturing. The detailed statistical results are summarized in Table 2 and discussed in the following sections.

2.3. Practical Implementation of AR Sequences in Manufacturing

To demonstrate the practical application of the processes described in the article, we decided to include an example of AR solution implementation in industrial practice. This example serves as a complement to the main content of the article, providing a practical illustration of the methodology’s applicability in adjusting press arms, a process that requires high precision.
The process of adjusting press arms, which determines the final length of the produced components, was optimized through the implementation of HoloLens 2 (Microsoft Corporation, Redmond, Washington, USA) devices and the software tools Vuforia Studio and PTC Illustrate. This process was chosen as an example due to its complexity and the frequent need for collaboration among multiple workers.
The solution included:
  • Visualization of required values and technical data directly in the operator’s field of view.
  • Integration of annotation windows with step-by-step instructions and visual AR elements.
  • A reduction in errors and the elimination of the need for a second operator.
It is important to emphasize that the process of creating sequences and implementing AR elements described in the article regarding the example of 3D printers and print head disassembly is directly applicable to this case. This practical demonstration reflects not only theoretical aspects but also real-world-tested procedures for implementing AR technologies in manufacturing.

2.3.1. Process of Creating and Implementing AR Environments

The implementation involved:
  • Creating Digital Sequences: Using the Vuforia Studio and PTC Illustrate software, step-by-step workflows were designed, including visual elements and interactive annotations.
  • Optimization in Real Environments: AR elements were fine-tuned directly in the working environment to ensure that they were fully functional and intuitive for the end-user. An example of the process of adjusting press arms using AR can be seen in Figure 10.

2.3.2. Challenges and Benefits of Implementation

The key challenges included:
  • Device Compatibility: Identified issues related to the limited field of view of head-mounted displays and the impracticality of tablets in manufacturing conditions.
  • Visualization Stability: Ensuring that AR elements do not obstruct the operator and enhance spatial orientation.
These challenges were addressed through the optimization of visual elements and the integration of object tracking, ensuring smooth and efficient implementation. Additionally, the software also facilitates the creation of offline applications and the sharing of applications from cloud services. This flexibility ensures that direct ownership of the software is not always required, allowing it to function seamlessly from external sources, as demonstrated in the previously mentioned manufacturing implementation.

3. Results and Discussion

The analysis of AR-assisted workflows was conducted based on the results summarized in Table 1 and Table 2. These tables outline the task completion times, error rates, and efficiency metrics across the three test subjects: Subject 1 (ThingMark positional tracking), Subject 2 (Model/Object tracking), and Subject 3 (Manual guidance using PDF manuals). Table 3 summarizes the justification of performance differences between test groups based on previous results and measurements.

3.1. Efficiency Metrics

  • Completion times
    • The AR-assisted groups demonstrated significantly shorter task completion times compared to the manual guidance group. Subject 1 completed the sequence in 25 min, Subject 2 completed the sequence in 27 min, and Subject 3 completed the sequence in 40 min.
    • This reduction in time highlights the impact of real-time visual guidance and intuitive tracking methods in streamlining maintenance workflows. The 2 min difference observed in Subject 2 is attributed to insufficient skills in utilizing the sequence with 100% overlay.
  • Error rates
    • Subject 1 recorded an error rate of 5%, while Subject 2 had a slightly higher rate of 7%. Subject 3, relying on PDF manuals, exhibited a significantly higher error rate of 15%.
    • The reduced error rates in AR-assisted groups underscore the advantages of interactive guidance and model tracking, which minimize human errors, particularly in complex or unfamiliar tasks.
  • Efficiency metrics (EIR, ERR, and OWE)
    • The Efficiency Improvement Ratio (EIR): Subject 1 achieved a 37.5% improvement in task efficiency, while Subject 2 achieved 32.5%, demonstrating the time-saving potential of AR-guided workflows.
    • Error Rate Reduction (ERR): The ERR values of 66.7% for Subject 1 and 53.3% for Subject 2 highlight the effectiveness of AR in mitigating errors during maintenance.
    • Overall Workflow Efficiency (OWE): Both AR Subjects achieved an OWE of 85%, reflecting the balance between reduced errors and faster task completion.
    • Identical OWE values for Subject 1 and Subject 2, despite differing error rates, reflect the compensatory nature of the metric. For instance, Subject 1’s higher error rate was offset by faster task completion time, while Subject 2 demonstrated greater accuracy but at a slower pace. This highlights OWE’s ability to balance multiple performance parameters.
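The efficiency metrics above can be reproduced directly from the completion times and error rates in Table 2. A minimal sketch in Python, assuming EIR and ERR are computed as relative improvements over the manual-guidance baseline (Subject 3); the exact OWE formula is not stated in the text, so it is omitted here:

```python
# Reproduce EIR and ERR from the raw measurements in Table 2.
# Assumption: both ratios are relative improvements over the
# manual-guidance baseline (Subject 3: 40 min, 15% errors).

def efficiency_improvement_ratio(t_ar, t_manual):
    """EIR: relative reduction in completion time vs. the manual baseline [%]."""
    return (t_manual - t_ar) / t_manual * 100

def error_rate_reduction(e_ar, e_manual):
    """ERR: relative reduction in error rate vs. the manual baseline [%]."""
    return (e_manual - e_ar) / e_manual * 100

T_MANUAL, E_MANUAL = 40, 15  # Subject 3 (PDF manuals)

subjects = {
    "Subject 1 (ThingMark)": (25, 5),
    "Subject 2 (Model tracking)": (27, 7),
}

for name, (t, e) in subjects.items():
    eir = efficiency_improvement_ratio(t, T_MANUAL)
    err = error_rate_reduction(e, E_MANUAL)
    print(f"{name}: EIR = {eir:.1f}%, ERR = {err:.1f}%")
```

Running this yields EIR = 37.5% and ERR = 66.7% for Subject 1, and EIR = 32.5% and ERR = 53.3% for Subject 2, matching Table 2.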

3.2. Comparative Analysis of Display Methods

  • ThingMark positional tracking:
    • This method demonstrated high spatial accuracy and seamless alignment with physical components. It enabled a clear overlay of virtual models, making it intuitive for users to follow each step.
    • The low 5% error rate observed in this group reflects the stability and precision of marker-based tracking.
  • Model/object tracking:
    • While this method provided effective tracking, occasional misalignments in dynamic contexts led to a slightly higher error rate of 7%.
    • This tracking method is well suited for static environments but requires further optimization for scenarios involving frequent repositioning.
  • Manual Guidance:
    • The reliance on textual instructions and 2D illustrations significantly increased task complexity, leading to a higher completion time and error rate.
    • This approach lacks the interactivity and contextual clarity provided by AR guidance.

3.3. Impact of AR on Workflow Optimization

The integration of AR significantly enhanced both the efficiency and accuracy of maintenance tasks. Key factors contributing to this improvement include the following:
  • Real-Time Guidance: AR provided step-by-step instructions, reducing cognitive load and task confusion.
  • Interactive Warnings: Visual cues, such as alerts about high temperatures or improper tool usage, ensured adherence to safety protocols.
  • Enhanced Accessibility: AR sequences were accessible on smart devices, enabling on-demand guidance without the need for expert supervision.
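The step-by-step guidance and interactive warnings listed above can be modelled as a linear sequence of steps, each carrying its instruction text and optional safety cues. A minimal sketch in Python; the step texts and warnings below are illustrative placeholders, not the published 42-step sequence:

```python
from dataclasses import dataclass, field

@dataclass
class Step:
    instruction: str                                # text shown as the AR overlay
    warnings: list = field(default_factory=list)    # safety cues for this step

class ARSequence:
    """Linear playback of maintenance steps, as in the AR workflows above."""
    def __init__(self, steps):
        self.steps = steps
        self.index = 0

    def current(self):
        return self.steps[self.index]

    def advance(self):
        # Clamp at the final step instead of running past the sequence.
        if self.index < len(self.steps) - 1:
            self.index += 1
        return self.current()

# Illustrative steps only -- the real sequence comprises 42 of them.
sequence = ARSequence([
    Step("Power off the printer and let the hot end cool."),
    Step("Remove the print head cover.",
         warnings=["Hot end may still be hot -- verify temperature first."]),
    Step("Replace the thermistor.",
         warnings=["Do not overtighten the clamp screw."]),
])

step = sequence.current()
print(step.instruction)
for w in sequence.advance().warnings:
    print("WARNING:", w)
```

Attaching warnings to individual steps, rather than showing them globally, mirrors how the AR sequences surface safety cues only at the step where they are relevant.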

3.4. Sustainability and Practical Implications

  • Reduction in Downtime: The faster task completion times and reduced error rates directly translate into minimized downtime, enhancing overall productivity.
  • Skill Development: The AR-guided workflows were particularly beneficial for inexperienced operators, providing a hands-on learning experience that bridges the gap between theory and practice.
  • Scalability: The modular nature of AR sequences allows for easy adaptation to different maintenance tasks, making this approach suitable for a wide range of applications.

3.5. Challenges and Future Work

  • User Feedback
    • The survey indicated that some users initially found the AR interface overwhelming, highlighting the need for intuitive design and user training.
  • Future Work
    • The integration of the Thingworx Composer module offers real-time monitoring of diagnostic or manufacturing devices, enabling live data such as print head and bed temperatures to be displayed during AR sequence playback. This capability enhances the safety of maintenance operations by providing operators with critical live data. The software also supports data logging and the configuration of predictive diagnostics thresholds, further streamlining maintenance processes. Coupling AR sequences with predictive diagnostics tools not only improves operational efficiency but also underscores the value of AR-assisted interventions during equipment failures.

In conclusion, the results demonstrate the significant potential of AR-assisted workflows in improving maintenance efficiency and accuracy. Future studies should focus on refining tracking methods and expanding the application of AR to other operational contexts.
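The predictive diagnostics thresholds described above can be sketched as a simple rolling-log monitor. This is a generic illustration only; the actual Thingworx Composer configuration and API are not reproduced here, and the temperature limit, window size, and warning margin are assumed values:

```python
import statistics
from collections import deque

class TemperatureMonitor:
    """Logs readings and flags threshold breaches, mirroring the kind of
    predictive-diagnostics thresholds configurable in Thingworx Composer.
    Generic sketch -- limit, window, and 90% warning margin are assumptions."""

    def __init__(self, limit_c, window=10):
        self.limit_c = limit_c
        self.log = deque(maxlen=window)   # rolling data log of recent readings

    def record(self, reading_c):
        self.log.append(reading_c)
        return self.status()

    def status(self):
        latest = self.log[-1]
        if latest > self.limit_c:
            return "ALERT"    # live value exceeds the configured threshold
        if len(self.log) >= 3 and statistics.mean(self.log) > 0.9 * self.limit_c:
            return "WARN"     # rolling average trending toward the threshold
        return "OK"

# Example: print-head temperature limit of 250 C (illustrative value).
monitor = TemperatureMonitor(limit_c=250)
for t in (210, 235, 248, 255):
    print(t, monitor.record(t))   # 210 OK, 235 OK, 248 WARN, 255 ALERT
```

The WARN state before an outright breach is what makes the diagnostics predictive: an operator following an AR sequence can be alerted while the value is still trending upward, not only after the limit is crossed.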

4. Conclusions

The creation of AR sequences has proven to be a flexible and straightforward process, enabling the effective visualization and interaction of complex maintenance tasks. Through a structured methodology, including the integration of CAD models, animation data, and interactive warnings, the AR sequences have facilitated a streamlined workflow that is suitable for users with varying levels of expertise. This adaptability underlines the utility of AR in educational and industrial contexts.
The verification process demonstrated the significant advantages of AR-guided workflows, as evidenced by the reduced task completion times and error rates observed with Subjects 1 and 2. The findings underscore the critical role of AR in enhancing maintenance efficiency and ensuring consistent outcomes, even when performed by less experienced personnel. These results highlight the importance of integrating AR technologies into production systems to optimize performance and minimize errors.
Further progress in this field lies in the potential integration of AR sequences with IoT technologies. Real-time monitoring and data logging of parameters, such as print head temperature, offer new opportunities for predictive maintenance and fault estimation. This approach not only enhances safety during operations but also contributes to proactive system management. The synergy between IoT capabilities and AR-guided workflows has the potential to create a comprehensive tool for control, management, monitoring, and diagnostics. Such a system would enable AR-assisted interventions during equipment failures, providing efficient solutions even for operators with limited technical qualifications.
The findings of this study reinforce the transformative potential of AR and IoT integration in creating safer, more efficient, and user-friendly maintenance processes. Future research should continue to explore these synergies, further optimizing the technologies for broader industrial and educational applications.

Author Contributions

Conceptualization, J.K.; methodology, J.K. and P.G.; software and measurements J.K. and J.T.; validation, M.K. and J.T.; formal analysis, J.K.; investigation, J.K. and P.G.; resources, P.G. and J.T.; writing—original draft preparation, M.K.; review and editing, J.K.; visualization, J.K.; supervision, M.K.; funding acquisition, M.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by project KEGA 002TUKE-4/2023 and VEGA 1/0121/23, granted by the Ministry of Education, Research, Development and Youth of the Slovak Republic.

Data Availability Statement

Dataset available on request from the authors. The raw data supporting the conclusions of this article will be made available by the authors on request.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
FFF: Fused filament fabrication
AR: Augmented reality
VR: Virtual reality
IoT: Internet of Things
PLA: Polylactic acid
ABS: Acrylonitrile butadiene styrene
PETG: Polyethylene terephthalate glycol
CAD: Computer-aided design
PC: Personal computer
PDF: Portable document format
EIR: Efficiency improvement ratio
ERR: Error rate reduction
OWE: Overall workflow efficiency
PVI: Product View Instruction
PVZ: Product View Zip
STEP: Standard for the Exchange of Product Model Data
PTFE: Polytetrafluoroethylene

Figure 1. Data based on market reports, a statistical analysis of AR/VR spending trends in Europe 2019–2023, and further predictions.
Figure 2. Example of the maintenance sequence creation in PTC Illustrate.
Figure 3. Ender 3v2 additive device and 3D printing farm.
Figure 4. Modeled hot-end components (details of the print head components: PTFE tube (a), heating element (b), thermistor (c), and connecting components: clamp (d), with (e) representing the screw).
Figure 5. Example of the created AR Workflow.
Figure 6. AR interface design and preview in Vuforia Studio.
Figure 7. Example of AR sequence tracking.
Figure 8. AR workflow preview in Vuforia Studio.
Figure 9. The resulting display of the introduction of the AR sequence (details of the individual parts of the sequence: start of the sequence (a), removal of the print head cover (b), with (c) showing a detailed view of the print head).
Figure 10. Visualization of an AR-assisted press arm adjustment using HoloLens 2.
Table 1. Overview of user testing, tasks, metrics, and validation procedures for AR-assisted maintenance.

Aspect | Details
User Testing | Subject 1: used workflows based on ThingMark positional tracking. Subject 2: used workflows based on model/object tracking. Subject 3: performed the same tasks without AR guidance, using PDF manuals on a PC.
Tasks | The sequence consisted of 42 steps, guiding the users through the disassembly of the print head, thermistor replacement, and reassembly.
Metrics for Comparison | Task completion times; error rates during assembly and disassembly; user feedback on workflow usability.
Comparative Analysis | The time and error differences among the three groups were statistically analyzed.
Material Performance Validation | The 3D-printed parts were inspected for consistency in mechanical and thermal properties post-maintenance using AR workflows.
Feedback and Usability Studies | Participants were surveyed for qualitative feedback on the clarity, usability, and overall satisfaction of the AR guidance system.
Table 2. Completion times, error rates, and efficiency metrics across test groups.

Test Subject | Completion Time [min] | Error Rate [%] | EIR [%] | ERR [%] | OWE [%]
Subject 1 (ThingMark) | 25 | 5 | 37.5 | 66.7 | 85
Subject 2 (Model tracking) | 27 | 7 | 32.5 | 53.3 | 85
Subject 3 (Manual) | 40 | 15 | - | - | -
Table 3. Justification of performance differences between test groups.

Test Subject | Completion Time [min] | Error Rate [%] | Justification
Subject 1 (ThingMark) | 25 | 5 | High spatial accuracy and seamless alignment with physical components.
Subject 2 (Model tracking) | 27 | 7 | Moderate tracking stability with occasional misalignments in dynamic contexts.
Subject 3 (Manual) | 40 | 15 | Reliance on textual guidance without spatial cues increases task complexity.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Kaščak, J.; Kočiško, M.; Török, J.; Gabštur, P. Advanced Approaches to Material Processing in FFF 3D Printing: Integration of AR-Guided Maintenance for Optimized Manufacturing. J. Manuf. Mater. Process. 2025, 9, 47. https://doi.org/10.3390/jmmp9020047

