In today’s technical reports, there are increasing references to Artificial Intelligence performing analyses of materials to develop an understanding of the properties of various combinations of materials at the nanoscale. Single-layer materials are being combined with other single-layer materials to produce changes in material properties that are not observed in the bulk realm. As computer programs increase in complexity and capability, more and more materials are being evaluated for “interesting” characteristics.
First, consider the computers and the programs. Prior to the mid-1980s, computers were large and relatively slow. Memory was expensive. The Expert Systems that were developed could not access large quantities of information, so each system needed a table to reference its data. The systems I built in the early 1980s used a matrix, like today’s spreadsheets, to reference probabilities of occurrences. The system was designed to guide the worker in more quickly identifying the source of an issue. After the specific source was identified, the worker entered the data, and the system updated the probabilities of occurrences. An interesting observation was that employing this particular system at two different, unlinked facilities produced different probabilities of occurrences at the two locations. [References 1 & 2 are background on Expert Systems and Artificial Intelligence. Wikipedia is also a good starting point.]
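The matrix-of-probabilities approach described above can be sketched in a few lines of modern code. This is only an illustration of the general idea, not the original system; all names, symptoms, and sources here are hypothetical.

```python
# Hypothetical sketch of a probability-matrix diagnostic aid: rows are
# symptoms, columns are candidate sources, cells hold observed counts.
# Probabilities are recomputed from the counts, and each confirmed
# diagnosis updates the matrix. All names are illustrative.

class DiagnosticMatrix:
    def __init__(self, symptoms, sources):
        self.symptoms = symptoms
        self.sources = sources
        # start each count at 1 so every pairing has a nonzero probability
        self.counts = {s: {c: 1 for c in sources} for s in symptoms}

    def rank_sources(self, symptom):
        """Return (source, probability) pairs, most probable first."""
        row = self.counts[symptom]
        total = sum(row.values())
        ranked = sorted(row, key=row.get, reverse=True)
        return [(src, row[src] / total) for src in ranked]

    def confirm(self, symptom, source):
        """Worker confirms the actual source; update the counts."""
        self.counts[symptom][source] += 1

m = DiagnosticMatrix(["overheating", "misfeed"], ["bearing", "belt", "sensor"])
m.confirm("overheating", "bearing")
m.confirm("overheating", "bearing")
best, p = m.rank_sources("overheating")[0]
# after two confirmations, "bearing" ranks first for "overheating"
```

Because the counts come entirely from confirmed local observations, two unlinked installations of such a system would naturally drift toward different probabilities, exactly as observed at the two facilities.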
It should be noted that a specialized computer, known as a LISP machine, was developed based on MIT Artificial Intelligence research and had a 36-bit architecture. It is an interesting story [Ref. 3].
Today’s Artificial Intelligence programs can draw on much larger databases, can extract information from published materials on their own, and can be programmed to continually run evaluations based on directed changes in the events/materials being selected. While the field has grown in the twenty-first century, questions are arising about the inability to prevent biases from being incorporated into the programming. (Possibly something like the location-dependent probabilities described above for expert systems?)
A saying that has been around almost as long as computers is GIGO: Garbage In, Garbage Out! The quality of the data is critical to the resultant output. This takes us back to the application of Artificial Intelligence to evaluating novel materials. The calculated material properties will be only as good as the data that the calculations are based upon. There is no question that the working materials can be characterized fairly exactly. The question is: “Do we know the material properties of absolutely pure materials?”
Semiconductors are created by introducing a very small number of impurity atoms to change the characteristics of the base material. The mathematics and the experimentation that provided the data are proven. They work on the bulk scale. But what happens as the total amount of material being modified becomes smaller and smaller? This is a serious question that needs to be answered.
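The question can be made concrete with a back-of-the-envelope count of dopant atoms in a shrinking volume. The doping level below is an assumed, typical light-doping figure (10^15 atoms/cm^3), not a value from the text:

```python
# Back-of-the-envelope: expected dopant atoms in ever-smaller doped volumes.
# Assumed figure (not from the text): light doping of 1e15 atoms/cm^3.

DOPANT_PER_CM3 = 1e15          # assumed typical light doping concentration
NM_TO_CM = 1e-7                # 1 nm = 1e-7 cm

def expected_dopants(edge_nm):
    """Expected dopant atoms in a cube edge_nm nanometers on a side."""
    volume_cm3 = (edge_nm * NM_TO_CM) ** 3
    return DOPANT_PER_CM3 * volume_cm3

for edge in (1000, 100, 10):
    print(f"{edge:>5} nm cube: {expected_dopants(edge):g} expected dopants")
```

At these assumptions a 1000 nm cube holds about a thousand dopant atoms, a 100 nm cube holds about one, and a 10 nm cube holds about 0.001, i.e., most such cubes contain no dopant at all. Statistical averages that hold on the bulk scale stop describing any individual nanoscale sample.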
In working with materials, the purity is specified. 99.99% pure is quite pure. Six 9s (99.9999%) is even better: that is one part impurity in one million. If one uses an approximation that there are 7 atoms of gold in one cubic nanometer, then there are 7 million gold atoms in a cube 100 nm on a side, which at one part per million implies 7 atoms that are not gold. How does that change the properties of the material? How do we purify gold or any other material to remove every extraneous atom? Do we know the material properties of absolutely pure material?
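The figure of 7 atoms per cubic nanometer is a rough approximation; the count can also be derived from gold’s bulk density and molar mass (standard values), which gives a somewhat higher atom density. Either way, the qualitative conclusion stands: six-9s purity still leaves a countable number of foreign atoms in a 100 nm cube.

```python
# How many atoms are in a 100 nm gold cube, and how many impurities at 1 ppm?
# Standard physical values: density 19.3 g/cm^3, molar mass 196.97 g/mol,
# Avogadro's number 6.022e23 per mole.

AVOGADRO = 6.022e23        # atoms per mole
DENSITY = 19.3             # g/cm^3, bulk gold
MOLAR_MASS = 196.97        # g/mol, gold

atoms_per_cm3 = DENSITY / MOLAR_MASS * AVOGADRO
atoms_per_nm3 = atoms_per_cm3 * 1e-21      # 1 cm^3 = 1e21 nm^3
cube_atoms = atoms_per_nm3 * 100**3        # cube 100 nm on a side
impurities = cube_atoms * 1e-6             # six-9s purity = 1 ppm impurity

print(f"{atoms_per_nm3:.0f} atoms/nm^3")           # roughly 59
print(f"{cube_atoms:.2e} atoms in the 100 nm cube")
print(f"about {impurities:.0f} impurity atoms at 1 ppm")
```

At the density-based count the cube holds tens of millions of atoms and, at 1 ppm, several dozen stray atoms; under the text’s rounder approximation it is 7. In both cases the impurities are individual, countable atoms, which is exactly the scale at which the question of “absolutely pure” material properties becomes acute.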