AI Algorithms and Material Informatics Tools

The integration of artificial intelligence (AI) algorithms in materials design is revolutionizing the field of materials engineering, owing to their ability to predict material properties, design de novo materials with enhanced features, and discover mechanisms that lie beyond intuition. In addition, they can be used to infer complex design principles and identify high-quality candidates more rapidly than trial-and-error experimentation.

artificial intelligence; machine learning; materials design; materials informatics

1. Introduction

The rapid advancements in artificial intelligence (AI) and machine learning (ML) hold immense potential for revolutionizing and expediting the arduous and costly process of materials development. In recent decades, AI and ML have ushered in a new era for materials science by leveraging computer algorithms to aid in exploration, understanding, experimentation, modeling, and simulation [1][2]. Working alongside human creativity and ingenuity, these algorithms contribute to the discovery and refinement of novel materials for future technologies.

2. AI Algorithms and ML Models

AI and ML have revolutionized the way we approach problem-solving and decision-making [3][4][5]. ML, in particular, is a field that focuses on the development and deployment of AI algorithms capable of analyzing data and its properties to determine actions without explicit programming [6].
Unlike traditional programming, ML algorithms leverage statistical tools to process data and learn from it, allowing them to improve and adapt dynamically as more data becomes available. This concept of “learning” forms the foundation of ML, enabling algorithms to make predictions, recognize patterns, and make informed decisions.
ML algorithms can be broadly categorized into three main types, each serving different purposes: supervised learning, unsupervised learning, and reinforcement learning. These diverse categories of ML algorithms provide a powerful toolkit for solving complex problems, optimizing processes, and extracting valuable insights from data. As illustrated in Figure 1, they form a comprehensive framework that enables AI systems to analyze and interpret data, facilitating intelligent decision-making and automation across various domains.
Figure 1. (A) A schematic representation of the materials design workflow using AI, consisting of three essential elements: a material dataset, machine learning models capable of learning and interpreting representations for specific tasks using the provided dataset, and output that yields optimized and enhanced material properties for the creation of advanced materials. Reproduced with permission from Ref. [7] (CC BY 4.0). (B) An overview of machine learning methodologies highlighting the three primary categories: supervised learning, unsupervised learning, and reinforcement learning.
Supervised learning involves training algorithms using labeled data, where inputs and desired outputs are provided to teach the algorithm how to make accurate predictions. This learning process is based on comparing the predicted output with the expected output; that is, learning consists of computing the error and adjusting the model parameters to reduce it until the expected output is reproduced. Examples of such algorithms include linear regression (LIR) [8], support vector regression (SVR) [9], feedforward neural networks (FFNNs) [10], and convolutional neural networks (CNNs) [11].
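As a minimal sketch of this supervised setting (written in Python with scikit-learn and NumPy, which are our own choices rather than tools prescribed by the entry; the composition features and property values below are synthetic placeholders), an SVR model is trained on labeled composition-property pairs and evaluated on held-out data:

# Minimal supervised-learning sketch: predict a material property (e.g., yield
# strength) from composition features. All data are synthetic placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
X = rng.random((200, 3))                       # e.g., fractions of three alloying elements
y = 50 + 120 * X[:, 0] - 30 * X[:, 1] ** 2 + rng.normal(0, 2, 200)  # synthetic property

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = SVR(kernel="rbf", C=10.0)              # support vector regression (SVR)
model.fit(X_train, y_train)                    # "learning": minimize prediction error on labeled data

print("MAE on held-out data:", mean_absolute_error(y_test, model.predict(X_test)))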
In addition to classical supervised ML algorithms such as LIR, SVR, or random forests (RFs), which are useful for predicting mechanical features of materials [12], researchers have developed artificial neural networks (ANNs) inspired by the interconnected neurons in the human brain to delve into deep data mining [13]. Among these networks, FFNNs, or multilayer perceptrons (MLPs), have emerged as quintessential and relatively simple models. FFNNs are extensively used in ML and deep learning, featuring multiple layers of interconnected nodes or neurons arranged sequentially. These networks facilitate data flow in a unidirectional manner, from the input layer to the output layer, without any loops or feedback connections.
Each layer, comprising multiple neurons, calculates outputs for the subsequent layer based on inputs received from the preceding layer. The weights or trainable parameters associated with each neuron are optimized to minimize the loss function, allowing the FFNN to learn complex patterns and relationships in the data.
In the field of materials design, FFNNs can be effectively employed in various ways [14][15][16]. For instance, by providing the composition, processing conditions, and microstructure of a material as inputs, an FFNN can learn the intricate relationship between these factors and properties such as mechanical strength, thermal conductivity, or electrical resistivity. This enables the optimization of material compositions and processing parameters to achieve desired properties. Furthermore, FFNNs can be combined with optimization algorithms to explore and optimize material designs [17][18]. By treating the FFNN as a surrogate model that approximates the relationship between input variables (e.g., material composition, processing conditions) and desired performance metrics, optimization algorithms can efficiently search the design space to identify optimal material configurations. This application proves particularly valuable when seeking materials with specific properties or performance targets.
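A possible sketch of this surrogate-based workflow, assuming a scikit-learn MLPRegressor stands in for the FFNN and a simple random search stands in for the optimization algorithm (all variable names, data, and targets are illustrative assumptions, not taken from the entry):

# FFNN surrogate + random search over the design space (illustrative only).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Pretend these are measured (composition, processing) -> property records.
X = rng.random((500, 4))                        # e.g., 3 composition fractions + 1 annealing temperature
y = 200 * X[:, 0] * X[:, 3] - 50 * X[:, 1] + rng.normal(0, 1, 500)

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=1)
surrogate.fit(X, y)                             # the FFNN learns the input -> property mapping

# Random search: query the cheap surrogate instead of running new experiments.
candidates = rng.random((10000, 4))
scores = surrogate.predict(candidates)
best = candidates[np.argmax(scores)]
print("Candidate design maximizing predicted property:", best)

In practice the random search would be replaced by a more principled optimizer, but the division of labor is the same: the surrogate answers cheap queries so the optimizer can sweep the design space.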
However, it is important to note that successful utilization of FFNNs in materials design relies on the availability of high-quality training data, careful feature selection, and a thorough understanding of the model’s limitations and assumptions. Additionally, domain expertise and experimental validation remain crucial in interpreting and verifying the FFNN’s predictions and recommendations.
Besides FFNNs, CNN architectures are gaining widespread attention due to their applications in computer vision and natural language processing (NLP) [19]. They serve as a specialized type of ANN, specifically designed for analyzing visual data such as images or videos. CNNs possess the ability to automatically learn hierarchical representations of visual patterns and features directly from raw input data. The fundamental operation in CNNs is convolution, which preserves the spatial relationship between pixels. It involves sliding a filter matrix across the image matrix and summing the element-wise products into a feature map, where the filter contains trainable weights that are optimized during the training process for effective feature extraction. By employing various filters, CNNs can perform distinct operations such as edge detection on an image. Through the stacking of convolutional layers, simple features gradually combine to form more complex and comprehensive ones.
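The following PyTorch sketch (the library choice and all layer sizes are our assumptions) shows how stacked convolutional layers turn a raw grayscale "micrograph" into hierarchical features that feed a scalar property head:

# Minimal CNN sketch: hierarchical feature extraction from a 2D image
# followed by a property-prediction head. Shapes are illustrative.
import torch
import torch.nn as nn

class MicrographCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),    # learnable 3x3 filters (e.g., edge detectors)
            nn.ReLU(),
            nn.MaxPool2d(2),                              # down-sample, keep salient features
            nn.Conv2d(8, 16, kernel_size=3, padding=1),   # deeper layer: more complex, composite features
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(16, 1)                      # scalar property (e.g., stiffness)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = MicrographCNN()
fake_micrographs = torch.randn(4, 1, 64, 64)              # batch of grayscale images
print(model(fake_micrographs).shape)                      # torch.Size([4, 1])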
CNNs have revolutionized computer vision tasks, encompassing image classification, object detection, segmentation, and more [19]. Their hierarchical structure, parameter sharing, and spatial invariance properties contribute to their efficacy in learning and extracting meaningful features from visual data. Consequently, CNNs have found widespread adoption in numerous domains, including medical imaging, facial recognition, and image-based recommender systems [20][21].
In the field of material design problems, CNNs hold great potential. With their capability to capture features at different hierarchical levels, CNNs are well-suited for describing the properties of materials, which inherently possess hierarchical structures, particularly in the case of biomaterials [22].
Overall, CNNs offer a powerful toolset for extracting relevant information from visual data, enabling breakthroughs in a wide range of applications and fostering advancements in fields such as materials engineering and design.
Unsupervised learning, on the other hand, deals with unlabeled data, where the algorithm identifies patterns and structures within the data without explicit guidance. An intriguing and successful category of unsupervised architectures is the generative adversarial network (GAN), which consists of two neural networks, the generator and the discriminator [23]. GANs are designed to learn and generate synthetic data that resembles a target dataset, without the need for explicit labels or supervision. The generator network is responsible for generating synthetic data samples. It takes as input random noise or a latent vector and produces data that resembles the target dataset. The generator network usually consists of one or more layers of neural nodes, often using deconvolutional layers to up-sample the noise into a larger output. The discriminator network acts as a binary classifier, distinguishing between real data samples from the target dataset and synthetic data samples generated by the generator network. The discriminator network aims to correctly classify whether a given sample is real or fake. It is trained with labeled data, where real samples are labeled as “real” and synthetic samples as “fake”. GANs have gained significant attention in the field of ML and have shown impressive capabilities in generating realistic and diverse data, including images, text, and audio [24]. They have also shown potential in unsupervised learning tasks, where the generated data can be used for downstream tasks such as clustering, representation learning, and semi-supervised learning. It is important to note that GANs require careful tuning, hyperparameter selection, and large amounts of training data to achieve optimal performance. Additionally, evaluation metrics for GANs are an ongoing area of research, as assessing the quality and diversity of generated samples can be subjective.
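A compact sketch of the generator-discriminator game described above, written in PyTorch with a toy 2-D "target dataset" (the architectures, hyperparameters, and data are illustrative assumptions, not any method cited in the text):

# Minimal GAN sketch: a generator maps noise to synthetic 2-D samples,
# a discriminator classifies real vs. fake.
import torch
import torch.nn as nn

real_data = torch.randn(1000, 2) * 0.5 + torch.tensor([2.0, -1.0])   # stand-in "target dataset"

G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))      # generator: noise -> sample
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))      # discriminator: sample -> logit

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    # Discriminator update: real samples labeled 1 ("real"), generated ones 0 ("fake").
    z = torch.randn(64, 8)
    fake = G(z).detach()
    real = real_data[torch.randint(0, 1000, (64,))]
    loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator update: try to make the discriminator call its fakes "real".
    z = torch.randn(64, 8)
    loss_g = bce(D(G(z)), torch.ones(64, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

print("final discriminator loss:", float(loss_d), "generator loss:", float(loss_g))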
In a notable study, Mao et al. [25] introduced a GAN-based approach for designing complex architectured materials with extraordinary properties, such as materials achieving the Hashin-Shtrikman upper bounds on isotropic elasticity. This method involves training neural networks using simulation data from millions of randomly generated architectured materials categorized into different crystallographic symmetries. The advantage of this approach lies in its ability to provide an experience-free and systematic framework that does not require prior knowledge and can be readily applied in diverse applications.
The significance of this methodology extends beyond the design of metamaterials. By leveraging simulation data and ML, it offers a novel and promising avenue for addressing a broad range of inverse design problems in materials and structures, opening up possibilities for tackling complex design challenges and exploring new frontiers in materials science and engineering. The work by Mao and colleagues thus contributes a practical and systematic method for designing materials with desired properties and, at the same time, highlights the potential of combining simulation data with ML in materials research: harnessing generative adversarial networks enables the exploration of vast design spaces and paves the way for innovative advancements in materials design and engineering.
Lastly, reinforcement learning focuses on training algorithms through interactions with an environment, where the algorithms learn by receiving feedback and rewards based on their actions. Among this class of ML algorithms, AlphaFold must surely be mentioned. It is a reinforcement learning-based AI system developed by DeepMind that has demonstrated exceptional capabilities in predicting protein structures [26]. It utilizes deep learning algorithms to accurately predict the 3D structure of proteins, which is a challenging and crucial task in the field of biochemistry and molecular biology. While initially focused on protein folding, the underlying principles and techniques of AlphaFold have the potential for broader applications, including materials problems such as interactive materials design.
Besides this outstanding example, graph neural networks (GNNs) belong to this class of ML algorithms. GNNs are a type of ANN designed to process and analyze graph-structured data [27]. Graphs are mathematical structures that consist of nodes connected by edges, and GNNs are specifically developed to capture and model the relationships and dependencies between nodes in a graph. Graph-structured data can represent various real-world systems and relationships: each node in the graph represents an entity, while the edges denote the connections or relationships between entities. Using GNNs, Guo and Buehler [28] developed an approach to design architected materials, in which the GNN model is integrated with a design algorithm to engineer the topological structures of the architected materials. The authors reported that such a method is applicable to design problems involving truss-like structures under complex loading conditions in additive manufacturing (e.g., 3D printing), architectural design, and civil infrastructure applications, and that it has the potential to be closely integrated with IoT methods and autonomous sensing and actuation approaches.
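To make the node-and-edge picture concrete, here is a minimal message-passing layer sketched in plain PyTorch so as not to assume any dedicated graph library; the 4-node graph (think of nodes as truss joints and edges as members) and all feature sizes are purely illustrative:

# Minimal graph neural network sketch: one message-passing layer that
# aggregates neighbor features through an adjacency matrix.
import torch
import torch.nn as nn

class MessagePassingLayer(nn.Module):
    def __init__(self, dim_in, dim_out):
        super().__init__()
        self.linear = nn.Linear(dim_in, dim_out)

    def forward(self, node_feats, adj):
        # Average each node's neighbors (including itself), then transform.
        adj_hat = adj + torch.eye(adj.size(0))
        deg = adj_hat.sum(dim=1, keepdim=True)
        aggregated = adj_hat @ node_feats / deg
        return torch.relu(self.linear(aggregated))

adj = torch.tensor([[0., 1., 1., 0.],
                    [1., 0., 1., 0.],
                    [1., 1., 0., 1.],
                    [0., 0., 1., 0.]])           # 4-node graph (symmetric adjacency)
node_feats = torch.randn(4, 3)                   # per-node features (e.g., joint coordinates)

layer = MessagePassingLayer(3, 8)
print(layer(node_feats, adj).shape)              # torch.Size([4, 8])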
ML algorithms used in materials design, along with example applications, are summarized in Table 1.
Table 1. Machine learning algorithms used in materials design and optimization.

3. Materials Informatics Methodologies

Besides the development of AI and ML models for processing and analyzing massive amounts of data, discovering patterns, and making predictions, materials informatics (MI), a multidisciplinary field that acts as a junction between materials science, data science, and AI, can unlock the potential of vast materials databases, thus accelerating materials design and development. By integrating MI tools with AI and ML algorithms (so-called hybrid AI), researchers can draw on the multitude of available data and extract valuable insights that were previously unreachable, revolutionizing the way materials are engineered in several fields of application [37][38][39].
The framework of MI mainly consists of three parts: (1) data acquisition, (2) data representation, and (3) data mining (or data analysis) [40].
Data acquisition involves obtaining physical and structural properties through simulations or experiments. Data representation focuses on selecting descriptors that capture the essential characteristics of materials within a dataset. Lastly, data mining aims to identify relationships between structural information and desired material properties [40].
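A toy end-to-end sketch of these three steps (Python with scikit-learn; the records, descriptors, and property are synthetic placeholders chosen only to make the pipeline concrete):

# Toy materials-informatics pipeline mirroring the three steps above.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# (1) Data acquisition: pretend these rows come from simulations or experiments.
rng = np.random.default_rng(2)
raw_records = [{"density": float(d), "porosity": float(p), "modulus": float(m)}
               for d, p, m in zip(rng.uniform(1, 8, 300),
                                  rng.uniform(0, 0.5, 300),
                                  rng.uniform(10, 200, 300))]

# (2) Data representation: choose descriptors that characterize each material.
X = np.array([[r["density"], r["porosity"]] for r in raw_records])
y = np.array([r["modulus"] for r in raw_records])

# (3) Data mining: learn the structure-property relationship from the descriptors.
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
print("Descriptor importances (density, porosity):", model.feature_importances_)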
MI methodologies are varied and tailored to address specific design challenges; the choice of methodology depends on the nature of the problem and the objectives of the research. Among these methods, the most widely used for materials design are the integration of data modalities, physics-based deep learning, materiomics, and computer vision methodologies. An overview of the main methodologies employed in MI, along with their key features, is reported in Table 2.
Integrating data modalities is a powerful data acquisition strategy. Transformer models, in particular, provide a robust framework for combining multiple sources of multimedia data, such as text, images, videos, and graphs, thereby expanding and enhancing datasets. This integration capability has empowered researchers, exemplified by the work of Hsu et al. [41], to design sustainable materials derived from biocompatible resources with greater efficacy. By leveraging the rich information contained in various data formats, transformer models facilitate the optimization of mechanical properties, starting from the microstructure level [42]. This integration of data modalities unleashes the potential for comprehensive materials analysis and design, opening up new avenues for advancing the field of sustainable materials.
To imbue the collected data with physical significance, physical principles can be integrated into deep learning techniques, resulting in efficient simulations. A deep learning method has been used to predict high-fidelity, high-resolution images of the stress fields near cracks while accounting for material microstructures [43]. Additionally, physics-informed neural networks have been utilized to derive data-driven solutions of nonlinear partial differential equations, easing the modeling of dynamic problems [44]. Furthermore, as reported by Lai and co-workers [45], a data-driven regression model demonstrated the correlation between the crystalline structure and luminescence characteristics of europium-doped phosphors, enabling the prediction of emission wavelengths.
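As a hedged illustration of the physics-informed idea (not of the specific models cited above), the PyTorch sketch below trains a small network so that the ODE residual du/dx + u = 0 and the condition u(0) = 1 both enter the loss; the true solution is exp(-x), so the prediction can be checked directly:

# Minimal physics-informed neural network sketch: fit u(x) so that it satisfies
# du/dx + u = 0 with u(0) = 1. Purely illustrative of how a physics residual
# enters the loss alongside boundary/data terms.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(3000):
    x = torch.rand(64, 1, requires_grad=True)              # collocation points in [0, 1]
    u = net(x)
    du_dx = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    physics_loss = ((du_dx + u) ** 2).mean()                # residual of the governing equation
    boundary_loss = (net(torch.zeros(1, 1)) - 1.0) ** 2     # enforce u(0) = 1
    loss = physics_loss + boundary_loss.squeeze()
    opt.zero_grad(); loss.backward(); opt.step()

print("u(0.5):", float(net(torch.tensor([[0.5]]))), " exact exp(-0.5): 0.6065")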
To achieve superior material properties starting from the design phase, it is fundamental to understand the intricate interplay between the physical, chemical, and topological properties of matter. In this context, materiomics employs analytically driven, simulation-driven, and data-driven procedures to predict complex behaviors by breaking down materials into their hierarchical building blocks. This approach is particularly valuable for bioinspired material design, as it considers the relevant scales and draws inspiration from the well-organized structures found in nature [46]. Furthermore, materiomics offers a promising avenue to ensure the environmental sustainability of manufactured structures, as it considers the life cycle and ecological impact of materials [47]. Moreover, the use of AI enables the resolution of inverse design problems, that is, the development of material compositions and structures that fulfill a specific set of target requirements. These often involve competing objectives, such as enhancing mechanical performance or efficiency while simultaneously reducing weight and cost [48][49][50][51].
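To make the inverse design notion concrete, here is an illustrative PyTorch sketch in which a fixed (and, for brevity, untrained) differentiable surrogate stands in for a learned structure-property model, and the design vector itself is optimized by gradient descent toward a target property; every name and number is an assumption, not a method from the cited works:

# Illustrative inverse-design sketch: optimize the design, not the model.
import torch
import torch.nn as nn

surrogate = nn.Sequential(nn.Linear(3, 16), nn.Tanh(), nn.Linear(16, 1))  # stand-in for a trained model
for p in surrogate.parameters():
    p.requires_grad_(False)                     # surrogate is frozen; only the design moves

design = torch.rand(1, 3, requires_grad=True)   # e.g., composition fractions
target = torch.tensor([[0.8]])                  # desired property value (arbitrary units)
opt = torch.optim.Adam([design], lr=0.05)

for step in range(500):
    loss = (surrogate(design) - target).pow(2).mean()   # distance from the target property
    opt.zero_grad(); loss.backward(); opt.step()
    with torch.no_grad():
        design.clamp_(0.0, 1.0)                 # keep the design physically plausible

print("optimized design:", design.detach().numpy(), "predicted property:", float(surrogate(design)))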
To enhance the interpretability of results in materials engineering, computer vision methodologies, such as graphic rendering and virtual reality, can be implemented. Yang et al. [52], for instance, combined an AI-based approach with molecular dynamics simulations to quantify the structure and properties of 3D graphene foams with mathematically regulated topologies. In another study, the same authors used a limited set of known data and multiple deep learning architectures to predict missing mechanical information and further analyze intricate 2D and 3D microstructures [53]. Furthermore, the training of specific algorithms can provide extra information on mechanical features, optimizing the material design process and leading scientists to new discoveries [54][55][56].
Another data mining tool is transfer learning together with fine-tuning of the algorithms. This methodology adapts pre-existing models to address problems that differ from the original one, for example by altering the characteristics of the input data or the reward value [57]. For instance, Jiang et al. [57] used a transfer learning algorithm for dynamic multi-objective optimization problems that generates an effective initial population pool by reusing past experience, thereby speeding up the design process.
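A minimal transfer-learning sketch in PyTorch (the "pretrained" network, the freezing policy, and the new regression head are all illustrative assumptions): reuse a feature extractor trained elsewhere, freeze it, and train only a replacement head on the new task.

# Minimal transfer-learning / fine-tuning sketch.
import torch
import torch.nn as nn

pretrained = nn.Sequential(                     # stands in for a model trained on a different task
    nn.Linear(10, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 5),                           # original 5-class head
)

backbone = nn.Sequential(*list(pretrained.children())[:-1])   # drop the old head
for p in backbone.parameters():
    p.requires_grad = False                     # freeze the reused layers

new_head = nn.Linear(64, 1)                     # new task: scalar property regression
model = nn.Sequential(backbone, new_head)

opt = torch.optim.Adam(new_head.parameters(), lr=1e-3)        # only the new head is trained
x, y = torch.randn(32, 10), torch.randn(32, 1)                # placeholder data for the new task
loss = nn.functional.mse_loss(model(x), y)
loss.backward(); opt.step()
print("fine-tuning step done, loss:", float(loss))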
Currently, the utilization of large language models (LLMs), such as ChatGPT, LLaMA, and Bard, is generating significant interest due to the profound impact this technology can have on human life [59][60]; moreover, their application in materials analysis can be considered a valuable instrument for intelligent material design and prototyping [61]. One notable advantage is the ability to fine-tune LLMs for specific tasks using a relatively small amount of labeled data, which can even be extracted from the published literature [62]. This approach eliminates the need to train a new model from scratch: the final layers of the neural network can be replaced to adapt the model's parameters, and the adapted model can then be trained in significantly less time than training it from scratch would take. These adaptations enable LLMs to be effectively applied in various domains, including dataset mining, molecular modeling, microstructure generation, and material structure extraction [63][64].
Lastly, another powerful tool in materials research is the autonomous discovery of materials using AI. By harnessing the capabilities of automated experimentation systems, laboratories empower AI to autonomously explore the extensive design space and make informed decisions about which experiments to conduct. This approach revolutionizes the traditional trial-and-error route to materials discovery by leveraging AI algorithms and ML techniques. The AI system can analyze vast amounts of data, including experimental results, material properties, and synthesis conditions, to identify patterns, correlations, and novel material candidates [65][66]. Through this autonomous exploration, complex new materials with unprecedented characteristics are unveiled [67]. An interesting development in this field was presented by Nikolaev et al. [68], who established the first autonomous experimentation (AE) system for materials development. Initially, the AE system was trained to grow carbon nanotubes with precise growth rates by exploring a six-dimensional processing-parameter space while building a deeper understanding of the underlying phenomena. Through an iterative research process spanning 600 autonomous iterations, the AE system successfully identified the optimal growth conditions to achieve the desired growth rate.
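The sketch below shows one common shape such a loop can take, under our own assumptions (scikit-learn Gaussian process surrogate, an upper-confidence-bound acquisition rule, and a synthetic stand-in for the "lab"); it is not the Nikolaev et al. system, only an illustration of the propose-measure-update cycle:

# Illustrative autonomous-experimentation loop: a surrogate proposes the next
# "experiment", a simulated lab returns a measurement, and the model is updated.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def run_experiment(x):                           # stand-in for a real synthesis + measurement
    return float(np.sin(5 * x) * (1 - x) + np.random.normal(0, 0.02))

rng = np.random.default_rng(3)
X_obs = rng.random((3, 1))                       # a few initial experiments
y_obs = np.array([run_experiment(x[0]) for x in X_obs])

grid = np.linspace(0, 1, 200).reshape(-1, 1)     # candidate processing conditions
for iteration in range(20):
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2)).fit(X_obs, y_obs)
    mean, std = gp.predict(grid, return_std=True)
    x_next = grid[np.argmax(mean + 1.5 * std)]   # upper-confidence-bound acquisition
    X_obs = np.vstack([X_obs, x_next])
    y_obs = np.append(y_obs, run_experiment(x_next[0]))

print("best condition found:", X_obs[np.argmax(y_obs), 0], "value:", y_obs.max())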
Overall, coupling MI tools with AI algorithms further enhances the materials discovery process and offers the possibility of renovating and transforming research methodologies. By leveraging these digital strategies, researchers can overcome traditional challenges encountered in materials design, thus accelerating the discovery and development of novel materials.
Table 2. Key methodologies in materials informatics.
Materials Informatics (MI) Tools
Type | Main Features | References
Integrating data modalities | Integration of different multimedia sources (text, images, videos, and graph data) into datasets | [41][42]
Physics-based deep learning | Integration of physics models into deep learning settings | [43][44][45]
Materiomics | Use of analytically driven, simulation-driven, and data-driven methods to break materials down into their essential building blocks | [46][47]
Inverse design | Solving inverse design problems | [48][49][50][51]
Computer vision methodologies | Combination of graphic rendering, virtual reality, and interpretable machine learning | [52][53]
Transfer learning and fine-tuning | Adaptation of pre-existing algorithms to the resolution of a different problem | [57][58]
Large language models (LLMs) | Processing of datasets of various natures to predict and generate text and other forms of content | [61][62][63][64]
Autonomous materials discovery | Autonomous exploration of the design space with self-directed decisions regarding experimentation and tests | [65][66][67][68]

References

  1. Pyzer-Knapp, E.O.; Pitera, J.W.; Staar, P.W.J.; Takeda, S.; Laino, T.; Sanders, D.P.; Sexton, J.; Smith, J.R.; Curioni, A. Accelerating materials discovery using artificial intelligence, high performance computing and robotics. Npj Comput. Mater. 2022, 8, 84.
  2. Li, J.; Lim, K.; Yang, H.; Ren, Z.; Raghavan, S.; Chen, P.-O.; Buonassisi, T.; Wang, X. AI Applications through the Whole Life Cycle of Material Discovery. Matter 2020, 3, 393–432.
  3. Pugliese, R.; Regondi, S.; Marini, R. Machine learning-based approach: Global trends, research directions, and regulatory standpoints. J. Inf. Technol. Data Manag. 2021, 4, 19–29.
  4. Sarker, I.H. Machine Learning: Algorithms, Real-World Applications and Research Directions. SN Comput. Sci. 2021, 2, 160.
  5. Koteluk, O.; Wartecki, A.; Mazurek, S.; Kołodziejczak, I.; Mackiewicz, A. How Do Machines Learn? Artificial Intelligence as a New Era in Medicine. J. Pers. Med. 2021, 11, 32.
  6. Das, S.; Dey, A.; Pal, A.; Roy, N. Applications of Artificial Intelligence in Machine Learning: Review and Prospect. Int. J. Comput. Appl. 2015, 115, 31–41.
  7. Frydrych, K.; Karimi, K.; Pecelerowicz, M.; Alvarez, R.; Dominguez-Gutiérrez, F.J.; Rovaris, F.; Papanikolaou, S. Materials Informatics for Mechanical Deformation: A Review of Applications and Challenges. Materials 2021, 14, 5764.
  8. Montgomery, D.C.; Peck, E.A.; Vining, G.G. Introduction to Linear Regression Analysis; John Wiley & Sons: Hoboken, NJ, USA, 2012; p. 821.
  9. Cortes, C.; Vapnik, V. Support-Vector Networks. Mach. Learn. 1995, 20, 273–297.
  10. Hadjiprocopis, A.; Smith, P. Feed Forward Neural Network Entities; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 1997; p. 1240.
  11. Indolia, S.; Goswami, A.K.; Mishra, S.; Asopa, P. Conceptual Understanding of Convolutional Neural Network—A Deep Learning Approach. Procedia Comput. Sci. 2018, 132, 679–688.
  12. Guo, K.; Yang, Z.; Yu, C.-H.; Buehler, M.J. Artificial intelligence and machine learning in design of mechanical materials. Mater. Horiz. 2021, 8, 1153–1172.
  13. Shawky, A.; El-Bhrawy, M.; Mohamed, N.E.K. Artificial Neural Networks in Data Mining. Int. J. Sci. Eng. Res. 2016, 7, 158–161.
  14. McConaghy, T.; Gielen, G. Analysis of Simulation-Driven Numerical Performance Modeling Techniques for Application to Analog Circuit Optimization. In Proceedings of the IEEE International Symposium on Circuits and Systems, Kobe, Japan, 23–26 May 2005; p. 1298.
  15. Zaunseder, E.; Müller, L.; Blankenburg, S. High Accuracy Forecasting with Limited Input Data: Using FFNNs to Predict Offshore Wind Power Generation. In Proceedings of the 9th International Symposium on Information and Communication Technology, Danang City, Vietnam, 6–7 December 2018; pp. 61–68.
  16. Akhlaghi, Y.; Kompany-Zareh, M. Comparing radial basis function and feed-forward neural networks assisted by linear discriminant or principal component analysis for simultaneous spectrophotometric quantification of mercury and copper. Anal. Chim. Acta 2005, 537, 331–338.
  17. Hartl, R.; Praehofer, B.; Zaeh, M. Prediction of the surface quality of friction stir welds by the analysis of process data using Artificial Neural Networks. Proc. Inst. Mech. Eng. Part L J. Mater. Des. Appl. 2020, 234, 732–751.
  18. Sagi, M.; Vu Doan, N.A.; Fasfous, N.; Wild, T.; Herkersdorf, A. Fine-Grained Power Modeling of Multicore Processors Using FFNNs. Int. J. Parallel Program. 2022, 50, 243–266.
  19. Bhatt, D.; Patel, C.; Talsania, H.; Patel, J.; Vaghela, R.; Pandya, S.; Modi, K.; Ghayvat, H. CNN Variants for Computer Vision: History, Architecture, Application, Challenges and Future Scope. Electronics 2021, 10, 2470.
  20. Buehler, M.J. Liquified protein vibrations, classification and cross-paradigm de novo image generation using deep neural networks. Nano Futures 2020, 4, 035004.
  21. Franjou, S.L.; Milazzo, M.; Yu, C.-H.; Buehler, M.J. Sounds interesting: Can sonification help us design new proteins? Expert Rev. Proteom. 2019, 16, 875–879.
  22. Xue, K.; Wang, F.; Suwardi, A.; Han, M.-Y.; Teo, P.; Wang, P.; Wang, S.; Ye, E.; Li, Z.; Loh, X.J. Biomaterials by design: Harnessing data for future development. Mater. Today Bio 2021, 12, 100165.
  23. Yang, Z.; Yu, C.-H.; Buehler, M.J. Deep learning model to predict complex stress and strain fields in hierarchical composites. Sci. Adv. 2021, 7, eabd7416.
  24. Shahriar, S. GAN computers generate arts? A survey on visual arts, music, and literary text generation using generative adversarial network. Displays 2022, 73, 102237.
  25. Mao, Y.; He, Q.; Zhao, X. Designing complex architectured materials with generative adversarial networks. Sci. Adv. 2020, 6, eaaz4169.
  26. Jumper, J.; Evans, R.; Pritzel, A.; Green, T.; Figurnov, M.; Ronneberger, O.; Tunyasuvunakool, K.; Bates, R.; Žídek, A.; Potapenko, A.; et al. Highly accurate protein structure prediction with AlphaFold. Nature 2021, 596, 583–589.
  27. Scarselli, F.; Gori, M.; Tsoi, A.C.; Hagenbuchner, M.; Monfardini, G. The Graph Neural Network Model. IEEE Trans. Neural Netw. 2009, 20, 61–80.
  28. Guo, K.; Buehler, M.J. A semi-supervised approach to architected materials design using graph neural networks. Extreme Mech. Lett. 2020, 41, 101029.
  29. Yang, K.; Xu, X.; Yang, B.; Cook, B.; Ramos, H.; Krishnan, N.M.A.; Smedskjaer, M.M.; Hoover, C.; Bauchy, M. Predicting the Young’s Modulus of Silicate Glasses using High-Throughput Molecular Dynamics Simulations and Machine Learning. Sci. Rep. 2019, 9, 8739.
  30. Zhao, Q.; Yang, H.; Liu, J.; Zhou, H.; Wang, H.; Yang, W. Machine learning-assisted discovery of strong and conductive Cu alloys: Data mining from discarded experiments and physical features. Mater. Des. 2021, 197, 109248.
  31. Tehrani, A.M.; Oliynyk, A.O.; Parry, M.; Rizvi, Z.; Couper, S.; Lin, F.; Miyagi, L.; Sparks, T.D.; Brgoch, J. Machine Learning Directed Search for Ultraincompressible, Superhard Materials. J. Am. Chem. Soc. 2018, 140, 9844–9853.
  32. Oh, S.; Jung, Y.; Kim, S.; Lee, I.; Kang, N. Deep Generative Design: Integration of Topology Optimization and Generative Models. J. Mech. Des. 2019, 141, 111405.
  33. Ravinder, R.; Sridhara, K.H.; Bishnoi, S.; Grover, H.S.; Bauchy, M.; Jayadeva; Kodamana, H.; Krishnan, N.M.A. Deep learning aided rational design of oxide glasses. Mater. Horiz. 2020, 7, 1819–1827.
  34. Ni, B.; Gao, H. A deep learning approach to the inverse problem of modulus identification in elasticity. MRS Bull. 2020, 46, 19–25.
  35. Hsu, Y.-C.; Yu, C.-H.; Buehler, M.J. Using Deep Learning to Predict Fracture Patterns in Crystalline Solids. Matter 2020, 3, 197–211.
  36. Bessa, M.A.; Glowacki, P.; Houlder, M. Bayesian Machine Learning in Metamaterial Design: Fragile Becomes Supercompressible. Adv. Mater. 2019, 31, e1904845.
  37. Moud, A.A. Recent advances in utility of artificial intelligence towards multiscale colloidal based materials design. Colloid Interface Sci. Commun. 2022, 47, 100595.
  38. Zahrt, A.F.; Henle, J.J.; Rose, B.T.; Wang, Y.; Darrow, W.T.; Denmark, S.E. Prediction of higher-selectivity catalysts by computer-driven workflow and machine learning. Science 2019, 363, eaau5631.
  39. Bai, B.; Han, X.; Zheng, Q.; Jia, L.; Zhang, C.; Yang, W. Composition optimization of high strength and ductility ODS alloy based on machine learning. Fusion Eng. Des. 2020, 161, 111939.
  40. Wan, X.; Feng, W.; Wang, Y.; Wang, H.; Zhang, X.; Deng, C.; Yang, N. Materials Discovery and Properties Prediction in Thermal Transport via Materials Informatics: A Mini Review. Nano Lett. 2019, 19, 3387–3395.
  41. Hsu, Y.-C.; Yang, Z.; Buehler, M.J. Generative design, manufacturing, and molecular modeling of 3D architected materials based on natural language input. APL Mater. 2022, 10, 041107.
  42. Shen, S.C.-Y.; Buehler, M.J. Nature-inspired architected materials using unsupervised deep learning. Commun. Eng. 2022, 1, 37.
  43. Buehler, M.J. Predicting mechanical fields near cracks using a progressive transformer diffusion model and exploration of generalization capacity. J. Mater. Res. 2023, 38, 1317–1331.
  44. Raissi, M.; Perdikaris, P.; Karniadakis, G.E. Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations. J. Comput. Phys. 2019, 378, 686–707.
  45. Lai, S.; Zhao, M.; Qiao, J.; Molokeev, M.S.; Xia, Z. Data-Driven Photoluminescence Tuning in Eu2+-Doped Phosphors. J. Phys. Chem. Lett. 2020, 11, 5680–5685.
  46. Buehler, M.J.; Cranford, S. Materiomics: Biological protein materials, from nano to macro. Nanotechnol. Sci. Appl. 2010, 3, 127–148.
  47. Shen, S.C.; Khare, E.; Lee, N.A.; Saad, M.K.; Kaplan, D.L.; Buehler, M.J. Computational Design and Manufacturing of Sustainable Materials through First-Principles and Materiomics. Chem. Rev. 2023, 123, 2242–2275.
  48. Lew, A.J.; Buehler, M.J. Single-shot forward and inverse hierarchical architected materials design for nonlinear mechanical properties using an Attention-Diffusion model. Mater. Today 2023, 64, 10–20.
  49. Chen, C.-T.; Gu, G.X. Generative Deep Neural Networks for Inverse Materials Design Using Backpropagation and Active Learning. Adv. Sci. 2020, 7, 1902607.
  50. Kalidindi, S.R. Feature engineering of material structure for AI-based materials knowledge systems. J. Appl. Phys. 2020, 128, 041103.
  51. Wang, N.; Chang, H.; Zhang, D. Deep-Learning-Based Inverse Modeling Approaches: A Subsurface Flow Example. J. Geophys. Res. Solid Earth 2020, 126, e2020JB020549.
  52. Yang, Z.; Buehler, M.J. High-Throughput Generation of 3D Graphene Metamaterials and Property Quantification Using Machine Learning. Small Methods 2022, 6, e2200537.
  53. Yang, Z.; Buehler, M.J. Fill in the Blank: Transferrable Deep Learning Approaches to Recover Missing Physical Field Information. Adv. Mater. 2023, 35, e2301449.
  54. Shen, S.C.Y.; Fernández, M.P.; Tozzi, G.; Buehler, M.J. Deep learning approach to assess damage mechanics of bone tissue. J. Mech. Behav. Biomed. Mater. 2021, 123, 104761.
  55. Olfatbakhsh, T.; Milani, A.S. A highly interpretable materials informatics approach for predicting microstructure-property relationship in fabric composites. Compos. Sci. Technol. 2022, 217, 109080.
  56. Oaki, Y.; Igarashi, Y. Materials Informatics for 2D Materials Combined with Sparse Modeling and Chemical Perspective: Toward Small-Data-Driven Chemistry and Materials Science. Bull. Chem. Soc. Jpn. 2021, 94, 2410–2422.
  57. Jiang, M.; Huang, Z.; Qiu, L.; Huang, W.; Yen, G.G. Transfer Learning-Based Dynamic Multiobjective Optimization Algorithms. IEEE Trans. Evol. Comput. 2018, 22, 501–514.
  58. Xu, D.; Luo, Y.; Luo, J.; Pu, M.; Zhang, Y.; Ha, Y.; Luo, X. Efficient design of a dielectric metasurface with transfer learning and genetic algorithm. Opt. Mater. Express 2021, 11, 1852.
  59. Kasneci, E.; Seßler, K.; Küchemann, S.; Bannert, M.; Dementieva, D.; Fischer, F.; Gasser, U.; Groh, G.; Günnemann, S.; Hüllermeier, E.; et al. ChatGPT for good? On opportunities and challenges of large language models for education. Learn. Individ. Differ. 2023, 103, 102274.
  60. Eloundou, T.; Manning, S.; Mishkin, P.; Rock, D. Gpts are gpts: An early look at the labor market impact potential of large language models. arXiv 2023, arXiv:2303.10130.
  61. Badini, S.; Regondi, S.; Frontoni, E.; Pugliese, R. Assessing the capabilities of ChatGPT to improve additive manufacturing troubleshooting. Adv. Ind. Eng. Polym. Res. 2023, 6, 278–287.
  62. Tshitoyan, V.; Dagdelen, J.; Weston, L.; Dunn, A.; Rong, Z.; Kononova, O.; Persson, K.A.; Ceder, G.; Jain, A. Unsupervised word embeddings capture latent knowledge from materials science literature. Nature 2019, 571, 95–98.
  63. Hu, Y.; Buehler, M.J. Deep language models for interpretative and predictive materials science. APL Mach. Learn. 2023, 1, 010901.
  64. Beltagy, I.; Lo, K.; Cohan, A. SciBERT: A Pretrained Language Model for Scientific Text; Association for Computational Linguistics: Hong Kong, China, 2019; pp. 3606–3611.
  65. Bukkapatnam, S.T. Autonomous materials discovery and manufacturing (AMDM): A review and perspectives. IISE Trans. 2022, 55, 75–93.
  66. Stach, E.; DeCost, B.; Kusne, A.G.; Hattrick-Simpers, J.; Brown, K.A.; Reyes, K.G.; Schrier, J.; Billinge, S.; Buonassisi, T.; Foster, I.; et al. Autonomous experimentation systems for materials development: A community perspective. Matter 2021, 4, 2702–2726.
  67. Lee, N.A.; Shen, S.C.; Buehler, M.J. An automated biomateriomics platform for sustainable programmable materials discovery. Matter 2022, 5, 3597–3613.
  68. Nikolaev, P.; Hooper, D.; Webber, F.; Rao, R.; Decker, K.; Krein, M.; Poleski, J.; Barto, R.; Maruyama, B. Autonomy in materials research: A case study in carbon nanotube growth. Npj Comput. Mater. 2016, 2, 16031.