Material Science And Metallurgy By Uc Jindal 54.pdf


Hedy Madrid

May 5, 2024, 11:32:54 AM
to peracace

Figure: the key elements of machine learning in materials science. (a) Schematic view of an example data set; (b) statement of the learning problem; (c) creation of a surrogate prediction model via the fingerprinting and learning steps. N and M are, respectively, the number of training examples and the number of fingerprint (or descriptor or feature) components.

Emerging materials informatics tools also offer tremendous potential and new avenues for mining structure-property-processing linkages from aggregated and curated materials data sets.63 While a large fraction of such efforts in the current literature has considered relatively simple definitions of the material, based mainly on its overall chemical composition, Kalidindi and co-workers64,65,66,67 have recently proposed a materials data science framework known as Materials Knowledge Systems68,69 that explicitly accounts for the complex hierarchical material structure in terms of n-point spatial correlations (also frequently referred to as n-point statistics). Building on n-point statistics as quantitative measures of materials microstructure, a flexible computational framework has been developed to customize toolsets for understanding structure-property-processing linkages in materials science.70
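To make the n-point statistics mentioned above concrete, the simplest case, the two-point autocorrelation of a binary (two-phase) microstructure, can be computed efficiently with FFTs. The microstructure below is a randomly generated toy array, not data from the cited works:

```python
import numpy as np

def two_point_statistics(m):
    """Periodic two-point autocorrelation f(r) = <m(x) m(x + r)> of a
    binary phase-indicator field, via the FFT autocorrelation theorem."""
    m = np.asarray(m, dtype=float)
    F = np.fft.fftn(m)
    corr = np.fft.ifftn(F * np.conj(F)).real / m.size
    return np.fft.fftshift(corr)  # move the zero-shift vector to the centre

# Toy two-phase microstructure: roughly 50% volume fraction of phase 1
rng = np.random.default_rng(0)
m = (rng.random((32, 32)) < 0.5).astype(float)
stats = two_point_statistics(m)
# At zero shift, the autocorrelation equals the phase-1 volume fraction
vol_fraction = stats[16, 16]
```

Higher-order (3-point, 4-point, ...) statistics follow the same idea with more shift vectors, at rapidly growing cost; the 2-point set is the workhorse in practice.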

The materials science community is just beginning to explore and utilize the plethora of available information-theoretic algorithms to mine and learn from data. The usage of an algorithm is driven largely by need, as it should be. One such need is the ability to learn and predict vectorial quantities. Examples include functions, such as the electronic or vibrational density of states (which are functions of energy or frequency). Although the target property in these cases may be viewed as a set of scalar quantities at each energy or frequency (for a given structure), to be learned and predicted independently, it is desirable to learn and predict the entire function simultaneously, because the value of the function at a particular energy or frequency is correlated with its values at other energies or frequencies. Properly learning the function of interest requires machine learning algorithms that can handle vectorial outputs. Such algorithms are indeed available,123,124 and, if exploited, can lead to schemes for predicting the electronic structure of new configurations of atoms. Another class of examples where vector learning is appropriate includes cases where the target property is truly a vector (e.g., atomic force) or a tensor (e.g., stress). In these cases, the vector or tensor transforms in a particular way as the material itself is transformed, e.g., if it is rotated (in the examples of functions discussed above, the vectors, i.e., the functions, are invariant to any unitary transformation of the material). These truly vectorial or tensorial target properties thus have to be handled with care, as has been done recently using vector learning and covariant kernels.102
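As a minimal sketch of function (vector-output) learning, kernel ridge regression extends naturally to a matrix of targets: a single linear solve learns the function value at every grid point through one shared kernel, so correlated grid points are fitted together rather than as unrelated scalars. The 5-component fingerprint and the Gaussian-peak "density of states" below are synthetic stand-ins, not data from the cited studies:

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between two sets of fingerprints."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

# Synthetic vectorial target: each "material" has a 5-component
# fingerprint, and the property is a smooth function (a Gaussian peak)
# sampled on a 50-point energy grid, so neighbouring grid values are
# strongly correlated.
rng = np.random.default_rng(0)
X = rng.random((200, 5))
energy = np.linspace(-5.0, 5.0, 50)
centers = 4.0 * X[:, 0] - 2.0          # peak position depends on fingerprint
widths = 0.5 + X[:, 1]                 # peak width depends on fingerprint
Y = np.exp(-((energy[None, :] - centers[:, None]) / widths[:, None]) ** 2)

X_train, Y_train, X_test = X[:150], Y[:150], X[150:]

# Kernel ridge regression with a matrix of targets: one solve learns
# all 50 grid values at once through the shared kernel.
K = rbf_kernel(X_train, X_train)
coef = np.linalg.solve(K + 1e-3 * np.eye(len(X_train)), Y_train)
Y_pred = rbf_kernel(X_test, X_train) @ coef   # one predicted curve per row
```

This shared-kernel construction captures output correlations only implicitly, through the smooth targets; fully covariant kernels of the kind used in ref. 102 encode the transformation behaviour of the outputs in the kernel itself.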

Another algorithm that is beginning to show value within materials science falls under multi-fidelity learning.125 This learning method can be used when a property of interest can be computed at several levels of fidelity, exhibiting a natural hierarchy in both computational cost and accuracy. A good materials science example is the band gap of insulators computed at an inexpensive lower level of theory, e.g., using a semilocal electronic exchange-correlation functional (the low-fidelity value), and the band gap computed using a more accurate, but expensive, approach, e.g., using a hybrid exchange-correlation functional (the high-fidelity value). A naive approach in such a scenario would be to use the low-fidelity property value as a feature in a machine learning model that predicts the corresponding high-fidelity value. However, using low-fidelity estimates as features strictly requires the low-fidelity data for all materials for which predictions are to be made using the trained model. This can be particularly challenging and extremely computationally demanding when faced with a combinatorial problem that targets exploring vast chemical and configurational spaces. A multi-fidelity co-kriging framework, on the other hand, can seamlessly combine inputs from two or more levels of fidelity to make accurate predictions of the target property at the highest fidelity. Such an approach, schematically represented in Fig. 6b, requires high-fidelity training data only on a subset of the compounds for which low-fidelity training data is available. More importantly, the trained model can make efficient highest-fidelity predictions even in the absence of low-fidelity data for the prediction-set compounds. While multi-fidelity learning is routinely used in several fields to address computationally challenging engineering design problems,125,126 it is only beginning to find applications in materials informatics.42
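A heavily simplified two-level scheme in the spirit of recursive co-kriging can be sketched as follows: a surrogate is trained on abundant low-fidelity data; a scaling factor and a residual ("delta") model are trained on the small subset that also has high-fidelity data; high-fidelity predictions for new compounds then need no low-fidelity input at query time. The functional forms and data here are synthetic placeholders, not the actual co-kriging machinery of refs. 125 and 42:

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def krr_fit(X, y, lam=1e-6, gamma=1.0):
    """Kernel ridge regression; returns a prediction function."""
    coef = np.linalg.solve(rbf_kernel(X, X, gamma) + lam * np.eye(len(X)), y)
    return lambda Xq: rbf_kernel(Xq, X, gamma) @ coef

# Synthetic two-fidelity toy: the "high-fidelity" property is a scaled,
# shifted version of the cheap "low-fidelity" one (hypothetical forms).
rng = np.random.default_rng(1)
X_lo = rng.random((200, 3))                      # many cheap calculations
f_lo = lambda X: np.sin(3.0 * X[:, 0]) + X[:, 1]
f_hi = lambda X: 1.5 * f_lo(X) + 0.3 * X[:, 2]
X_hi, y_hi = X_lo[:30], f_hi(X_lo[:30])          # expensive data: small subset

model_lo = krr_fit(X_lo, f_lo(X_lo))             # level-1 surrogate
lo_at_hi = model_lo(X_hi)
rho = (y_hi @ lo_at_hi) / (lo_at_hi @ lo_at_hi)  # least-squares scaling
model_delta = krr_fit(X_hi, y_hi - rho * lo_at_hi)

def predict_hi(Xq):
    # No low-fidelity calculation is needed for the query compounds:
    # the level-1 surrogate stands in for the cheap computation.
    return rho * model_lo(Xq) + model_delta(Xq)

pred = predict_hi(rng.random((5, 3)))
```

Full co-kriging additionally propagates correlated uncertainties between the fidelity levels; this sketch keeps only the structural idea that the high-fidelity model is built on top of a low-fidelity surrogate rather than on low-fidelity features.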

Perhaps the most important question facing new researchers eager to use data-driven methods is whether their problem lends itself to such methods. Needless to say, the existence of reliable past data, or an effort devoted to generating it in a uniform and controlled manner for at least a subset of the critical cases, is a prerequisite for the adoption of machine learning. Even so, the question remains whether machine learning is appropriate for the problem at hand. Ideally, data-driven methods should be aimed at (1) properties that are very difficult or expensive to compute or measure using traditional methods, (2) phenomena complex enough (or nondeterministic enough) that there is no hope of a direct solution based on fundamental equations, or (3) phenomena whose governing equations are not (yet) known; all three provide a rationale for the creation of surrogate models. Such scenarios abound in the social, cognitive, and biological sciences, which explains the pervasive application of data-driven methods in those domains. Materials science examples ideal for machine learning studies include properties such as the glass transition temperature of polymers, the dielectric loss of polycrystalline materials over a wide frequency and temperature range, the mechanical strength of composites, the failure time of engineering materials (e.g., due to electrical, mechanical, or thermal stresses), the friction coefficient of materials, etc., all of which involve the inherent complexity of materials, i.e., their polycrystalline or amorphous nature, multi-scale geometric architectures, the presence of defects of various scales and types, and so on.

We acknowledge financial support from several grants from the Office of Naval Research that allowed us to explore many applications of machine learning within materials science, including N00014-14-1-0098, N00014-16-1-2580, and N00014-10-1-0944. Several engaging discussions with Kenny Lipkowitz, Huan Tran, and Venkatesh Botu are gratefully acknowledged. GP acknowledges the Alexander von Humboldt Foundation.

Material Science And Metallurgy is a concise and lucid textbook that covers all the basic concepts of metallurgy and material science, with illustrations and extra preparatory practice questions.

Material science, or material engineering, is the branch of engineering concerned with the properties of materials and their applications. It explains the relation between the microscopic structure of a material and its macroscopic behaviour. Metallurgy is the branch of material science that deals with the chemical and physical nature of metals and their alloys, and with the ways science is applied to their production.

Material Science And Metallurgy has been designed as per the syllabi of leading competitive exams such as GATE and UPSC. All the basic topics pertaining to material science and metallurgy are covered in 20 chapters. Atomic Bonding and Composite Materials, Crystal Structure, Phase Diagrams, Phase Transformations, Ceramic Materials, Semiconductors, and Thermal Properties are among the topics treated. The language of the book is simple and accessible, and the theory is accompanied by several illustrative diagrams to aid understanding of the concepts.

More than 500 multiple-choice and review questions are included in the book, along with solved problems, figures, and tables. Material Science And Metallurgy is one of the few books that covers both metallurgy and material science in a compact and concise form. The book can be useful for students of mechanical and civil engineering.

