More on Cement and other Nano

Just prior to the end of June, there were two articles on cement/concrete.  “How to Make a ‘Smarter’ Cement” [Ref. 1] covers reports from a Northwestern University team that introduced nanoparticles into ordinary cement.  The researchers investigated two forms of nanoparticle and their impact on fracture toughness.  The material employed was graphene, both as rolled tubes (carbon nanotubes) and as flakes.  Incorporating small amounts of nanomaterial improved the bonding by tightening the pore structure, which reduced the air voids.  The team pointed out that this multilevel structure of nanoparticles and cement replicates structures found in nature.

The second article [Ref. 2] addresses the variety of applications.  This BBC report covers the vast array of concrete structures, both modern and ancient, being employed around the world.  The Pantheon in Rome was built in the second century AD and is still the world’s largest unreinforced concrete dome.  This structure’s survival implies that Roman cement was superior to what we are using in present-day construction.  (cf. last month’s blog)

There is more material in published reports on the need for 2D semiconductors.  This is especially true as the dimensions for the 3nm and 2nm semiconductor nodes are investigated.  Reference 3 is one of many such articles.  As the dimensions of the transistor decrease, there needs to be a corresponding reduction in the thickness of the channel.  Control of the electron flow is critical.  Nanosheets and nanowires are being considered, along with the gate-all-around transistor.  Issues arise at the 3nm scale with trapped imperfections in the materials.  2D materials appear to provide a possible solution.

From my viewpoint, the issue with 2D materials is the production of large quantities of defect-free material.  Various materials are being investigated, with transition metal dichalcogenides (MoS2, WS2, and WSe2) the most commonly proposed.  The materials of choice are flakes exfoliated from bulk material.  (That might bring back some memories of the early carbon nanotube work.)  Atomic Layer Deposition (ALD) can provide thin layers, but the process leaves behind residues that can contaminate the desired material.  A lot of research is being conducted, but much more needs to be accomplished before a 2D process can be ready for producing semiconductors at the scale needed.

From the medical standpoint, graphene and CNTs are being employed in developing sensors.  A team from Harvard [Ref. 4] has created a flexible electrode array that conforms to surfaces of the body.  The example application was described as being applied in difficult-to-reach places in order to provide recording or stimulation of electrical pulses without the potential of damaging nearby organs.  The team developed the gel and then needed to create the required circuitry.  They developed a combination of graphene flakes and CNTs that could provide a continuous conductive path.  When created on the gel, the long CNTs provide a “long” continuous pathway through multiple point contacts among numerous CNTs.  The conductors are then insulated on the gel so that only the desired points are exposed.  The team has successfully tested the gel in stimulating a mouse heart.  A key point in their development is that they needed to find some means of reducing the metal content required for conductivity.  And they succeeded.

What we are witnessing is novel thinking in applying different types of nanomaterials to solve problems that are challenging to resolve with current technology.

References:

  1. https://www.techbriefs.com/component/content/article/tb/stories/blog/39416
  2. https://www.bbc.com/future/article/20210628-concrete-the-material-that-defines-our-age
  3. https://semiengineering.com/thinner-channels-with-2d-semiconductors/
  4. https://www.graphene-info.com/graphene-and-cnts-assist-creating-viscoelastic-electrode-arrays


Nanomaterials and Cement

It has been an interesting month as the country starts coming back to “normal”.  I was fortunate to have the opportunity to “attend” (virtually) a presentation by a University of Texas at Austin professor.  Professor Hugh Daigle provided a very informative presentation on “Nanotechnology for Upstream Oil and Gas Operations”. [Ref. 1]  Whether you are interested in oil and gas or not, the presentation makes some points that matter for anyone with nanotechnology interests.  The reference is to a YouTube video of the presentation.

The presentation starts out by establishing a baseline of what a nanomaterial is.  The slide at roughly 7:35 of the presentation explains why nanoparticles are so interesting.  This is followed by coverage of the forces that are important to consider at the small scale.  Most people are aware of electrostatic and van der Waals forces; steric forces are not mentioned as often as the other two.  The description of the applications is interesting and well explained.

Jumping forward to roughly 22:24 in the presentation, the explanation of controlling nanoparticle surface charges is important to improving the properties of cement.  At roughly 25:30, he explains why the sizing of the nanoparticles can improve the properties of the cement.

Daigle’s presentation fits into a special area of my interest.  We know that the Roman Coliseum is still standing after 2,000 years, and we cannot duplicate the Romans’ efforts.  Research has been done, and there are some interesting findings.  Reference 2 has the explanation that has been accepted without too much understanding.  Basically, the Roman cement was superior due to its use of volcanic ash from the regions around the Gulf of Naples.  It is also known that Roman cement cures under water, salt water to be precise.

The work reported in Nature by Alexandra Witze [Ref. 3] has taken this a step further.  A team headed by Marie Jackson discovered that a very rare mineral, aluminum tobermorite, forms in the Roman cement. [Ref. 4]  The crystals of aluminum tobermorite grow from phillipsite, a silicate mineral common in volcanic rocks, through the action of salt water washing through the cement, which increases the alkalinity of the cement.  The tobermorite strengthens the cement due to a plate-like structure that permits flexure not observed in today’s version of cement.

Now, I come back to Professor Daigle’s presentation.  He noted that the size of the nanoparticles has a significant influence on the strength of the cement.  The question I have is this: did the Romans develop a process that created a consistent nanoparticle size, one that yielded the optimum size for the strongest cement?

For the doubters who think the Romans could not have had sufficient knowledge/understanding to know that they needed a certain size of material (nanoparticle), consider this fact.  In the Middle Ages, artisans knew they needed a certain size of gold nanoparticle to create the red coloring in stained glass windows.  Did they call them gold nanoparticles?  Of course not, but there had to be a process, probably milling, that after a certain amount of processing time yielded the proper size of material to create the red color.  It is possible that the Romans had a similar process for the phillipsite.

It does raise some interesting questions about the capabilities of the Romans and their cement.

References:

  1. https://youtu.be/cWzzR5h535g
  2. https://newscenter.lbl.gov/2013/06/04/roman-concrete/
  3. https://www.nature.com/news/seawater-is-the-secret-to-long-lasting-roman-concrete-1.22231
  4. https://www.degruyter.com/document/doi/10.2138/am-2017-5993CCBY/html


2D Materials are Making Progress

The past few months have seen several published articles on 2D materials and their applications.  The pace is picking up.

One of the challenges of 2D materials is that they need to be incorporated into some other material.  For a future semiconductor transistor application, there must be a means of combining the 2D material with the silicon (or other) substrate rapidly and in volume.  There are two significant issues with this combination.  The first is that the 2D material and the semiconductor substrate require different processing temperatures, which can damage the material already on the substrate.  The other is that the existing processes are not compatible with high-volume manufacturing.  Researchers in Sweden and Germany [Ref. 1] have developed a process that uses a standard dielectric and available wafer bonding equipment.

The researchers mated the 2D material, which is on its own wafer, with the semiconductor wafer.  The two wafers are pressed together at room temperature, and the temperature is then raised until the dielectric becomes viscous, allowing the 2D material to conform to the surface of the existing material on the semiconductor wafer.  Their efforts have been successfully demonstrated on a 100mm silicon wafer with uniform coverage and little strain in the 2D structures.

The development of 2D transistors has also been improving [Ref. 2].  Professor Saptarshi Das of Penn State has worked on the viability of transistors made from 2D materials.  The purpose of the research is to determine if the 2D transistor, with its significantly reduced footprint, can be the future of semiconductors.  This also implies the potential for faster speeds.  Reference 3 addresses the promises and prospects of 2D transistors.  It points out that there have been many proof-of-concept demonstrations.  There might be a need to evaluate 2D transistors against slightly different criteria to determine the performance of the devices.  This concept of transistor evaluation is also addressed in an Applied Physics Letters paper [Ref. 4], which lays out specific parameters that need to be considered.

Imec researchers are of the opinion that transistors made from 2D materials could become available in the near future [Ref. 5].  While they indicate that silicon will remain the primary material for microscale electronics, transistors constructed from transition metal dichalcogenides have potential.  The challenge is creating a single-atom-thick layer that is a natural semiconductor.

The need for the change in transistor construction is due to the ever-shrinking geometry of the current transistor.  As the structure of the transistor becomes smaller and smaller, the potential for unwanted/spurious leakage of electrons continues to increase.  This is caused by the physical limits of the material properties.  There have been design changes in the structure of transistors, but the designs needed to circumvent the limits of material properties are continually getting more difficult.  A detailed discussion of the challenges of 2D materials and their application to transistors is available in Reference 6.

The challenges of developing the most suitable material, creating an electronic structure that is efficient and fast, and engineering a manufacturing process that is both high-speed and high-yield are non-trivial.  2D material transistors are coming, but it will take some time before they are in mainstream production.

References:

  1. https://www.techbriefs.com/component/content/article/tb/insiders/electronics-sensors/stories/38954?
  2. https://www.techbriefs.com/component/content/article/tb/insiders/electronics-sensors/stories/38953?
  3. https://www.nature.com/articles/s41586-021-03339-z
  4. https://aip.scitation.org/doi/10.1063/5.0029712
  5. https://spectrum.ieee.org/semiconductors/materials/coming-soon-to-a-processor-near-you-atomthick-transistors
  6. https://pubs.rsc.org/en/content/articlelanding/2015/nr/c5nr01052g#!divAbstract


What is the Cost of Ownership?

Recently, the term “Cost of Ownership” has been appearing in various papers about a number of topics, including wind power and electric vehicles among others.  The Wall Street Journal had an article comparing the emissions cost of a gasoline car and an electric one [Ref. 1].  But what does the Cost of Ownership, or COO, really mean?

In the late 1970s and early 1980s, the basic concept of COO was developed to provide a comparison of the generation of electricity by coal versus nuclear.  As witnessed today in China, coal-fired generation plants can be constructed for a few million dollars in a short period of a year or less.  Nuclear, on the other hand, takes a significant investment of multiple billions of dollars and a planning and construction process that lasts many years.  If only construction costs are considered, the coal-fired approach is much better.  (Environmental impact was not as strongly considered as it is today.)

This concept – COO – was further developed by the semiconductor industry in the 1980s to justify the change from a “silk-screening” type of process for creating the small circuit patterns to projecting light through a mask to create the same patterns.  Each wafer (the material on which the circuits are created) contained multiple complete circuits that are separated into individual devices.  The screening process involved tools that cost $10,000 to $20,000; the optical projection tools were in excess of $100,000.  There was a throughput advantage to the optical system, but not by the factor of 5 or more reflected in the initial price.

An evaluation of the actual cost of the two different processes was developed.  The process for creating the actual circuitry involves multiple steps of imaging and processing to develop the layers of material that create the circuit functionality.  A difference in yield can be quantified, and that is part of the cost of ownership.  The costs include the lifetime of the equipment, the materials actually consumed, operating costs, and the actual yield of the product produced.

As an example, assume the optical method produced one more good wafer equivalent per hour, with each wafer worth $1,000 and each good device worth $50.  (The optical process created more good devices on an equivalent number of wafers compared to the screening process.)  If the system ran only 240 hours per month, the extra value of the optical process would be $240,000 per month!  That makes the $100,000 cost of the optical tool a non-issue.  Consequently, the semiconductor industry changed its process.
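The arithmetic is simple enough to sketch; a minimal Python illustration, using only the hypothetical figures from this example:

```python
# Illustrative cost-of-ownership comparison for the example above.
# All figures are the hypothetical numbers from the text, not real data.

extra_good_wafers_per_hour = 1   # optical tool's advantage in good wafer equivalents
wafer_value = 1_000              # dollars per good wafer
hours_per_month = 240            # assumed modest utilization

extra_value_per_month = extra_good_wafers_per_hour * wafer_value * hours_per_month
print(f"Extra value from the optical process: ${extra_value_per_month:,}/month")
# -> Extra value from the optical process: $240,000/month

# Against a ~$100,000 difference in tool price, the optical tool pays
# for itself in well under a month of operation.
```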

In the 1990s and 2000s, there was further refinement in the semiconductor industry to incorporate more variables, including repair costs and the resultant value of the lost product.  This COO process is moving toward a total life-cycle cost of ownership evaluation.  As the approach is applied to new industries, it is expanding to include the initial costs to create the tool/process and the cost of dismantling the equipment at its end of life.  These end-of-life costs are being raised by various people regarding batteries in general.  The standard lead-acid battery carries a replacement charge when a new battery is installed; this is to cover the cost of separating out the component materials.  Since all batteries have a finite life, one would assume that the batteries projected for storing electricity during non-peak hours will also have to be replaced.

It is not unreasonable to expect further developments in costing analysis to start to include the impact of the elimination of various technology products.  This might include the cost of jobs lost, which would be difficult to quantify given the non-measurable cost of a job and the question of whether it could be replaced by a different set of skills.  COO is a good tool for evaluating alternatives, but it needs to be applied consistently and with assumptions that can be measured.

References:

  1. https://www.wsj.com/graphics/are-electric-cars-really-better-for-the-environment/


It’s time to revisit Nano Technology Safety (Nano-Safety)

Fifteen years ago (2006), a white paper was published [Ref. 1] that addressed the need to create a structured approach to the handling and usage of nanomaterials.  As stated in the September 2013 blog on Nano-Safety, the primary issue involving nanotechnology is the concern of working with various materials whose properties are different from those of the bulk material, and often unknown.

“What is safety with respect to nanotechnology?  In the simplest terms, it is the development, manufacture, application, and control of nanomaterials in a manner that minimizes potential issues, both known and unknown, that may impact both people and the environment.  All chemicals can kill one; it is just a matter of quantity.  Dangerous materials, even poisons, can be beneficial if applied in the proper dosage.  Just because something is dangerous does not mean it should not be used.  Fire is dangerous if mishandled, but does that mean we should not use it?

“One concern about safety in handling nanomaterials is based on the unknown issues.  Not knowing the impact on people or the environment can lead to unfounded concerns and a reluctance to accept developments.  This was true not only for nanomaterials but for many commonly employed materials/processes when they were first introduced.  If you want to examine issues that have long had proponents and opponents, consider the application of milk pasteurization or the case of adding fluoride to drinking water.

“In approaching this problem, … the issue of Nano-Safety can be addressed in a systematic manner that involves 4 key concepts or pillars, which are: 1) Nanomaterial properties; 2) Impact on people and the environment; 3) Handling of nanomaterials; and 4) Business focus.” [Ref. 2]

The application of nanotechnology to health issues has seen significant developments, along with related issues.  Gold nanoparticles have been employed to attach to cancer cells and then be irradiated at IR wavelengths that pass harmlessly through skin but heat the gold and destroy the cancer cell to which each particle is attached.  Concerns can arise about the accumulation of the gold in the body.

Fortunately, the tools available for investigating nanomaterials have been improving, along with a better understanding of the interaction of nanomaterials with the human body.

Research is approaching an interesting time.  Work is being done on nanosensors that can be connected to create a sense of “feeling” in prosthetics.  Graphene, on which much research has been conducted, is now finding applications in material composites, where each layer is a single atom thick.  We are not close to large-scale manufacturing yet, but applications are being developed that promise to reproduce some aspects of the human sense of touch.  There is also some work being done that could mitigate eye damage like macular degeneration.

One issue that needs to be addressed is the impact of the new composites on people and the environment.  The process for long-term evaluation and governmental approval is long, especially when compared to the half-life of startup companies.

There is a need to reevaluate the existing guidelines for Nano-Safety and to create an update directed at nanotechnology in the medical environment.  The ability to expedite the development and application of nanotechnology in medical situations could provide a benefit for many.  This could be a 5th pillar of Nano-Safety.  More on this topic will appear in future blogs.

References:

  1. Available at http://www.tryb.org/a_white_paper_on_nano-safety.pdf
  2. September 2013 blog http://www.nano-blog.com/?m=201309


Let’s start 2021 out with a question.

In today’s technical reports, there are more and more references to Artificial Intelligence performing analyses of materials to develop an understanding of the properties of various combinations of materials at the nano-level.  Single-layer materials are being combined with other single-layer materials to produce changes in material properties that are not observed in the bulk realm.  As computer programs increase in complexity and capability, more and more materials are being evaluated for “interesting” characteristics.

First, consider the computers and the programs.  Prior to the mid-1980s, computers were large and relatively slow, and memory was expensive.  The Expert Systems being developed could not access large quantities of information, so the systems needed a table to reference their data.  The systems I built in the early 1980s used a matrix, like today’s spreadsheets, to reference probabilities of occurrences.  The system was designed to guide the worker in more quickly identifying the source of an issue.  After the specific source was identified, the worker entered the data, and the system updated the probabilities of occurrences.  An interesting observation was that employing this particular system at two different, unlinked facilities produced different probabilities of occurrences at the two locations.  [References 1 & 2 are background on Expert Systems and Artificial Intelligence; Wikipedia is also a good starting point.]
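A minimal sketch of how such a matrix-driven system might work (the symptom names, cause names, and counts below are hypothetical; the actual system’s structure was more involved):

```python
# Toy expert system: a matrix of observed counts linking symptoms to
# root causes. Probabilities are just normalized counts, and each
# confirmed diagnosis updates the matrix -- so two unlinked sites
# drift toward different probabilities, as noted in the text.
counts = {
    "misalignment": {"fixture wear": 12, "operator setup": 5},
    "contamination": {"fixture wear": 2, "filter failure": 9},
}

def likely_causes(symptom):
    """Return causes ranked by probability for a reported symptom."""
    row = counts[symptom]
    total = sum(row.values())
    return sorted(((cause, n / total) for cause, n in row.items()),
                  key=lambda pair: -pair[1])

def confirm(symptom, cause):
    """Worker confirms the true cause; the matrix learns from it."""
    counts[symptom][cause] = counts[symptom].get(cause, 0) + 1

print(likely_causes("misalignment"))   # ranked guesses for the worker
confirm("misalignment", "operator setup")
print(likely_causes("misalignment"))   # probabilities shift with new data
```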

It must be noted that there was a specialized computer, based on MIT Artificial Intelligence work, that was identified as a LISP machine and had a 36-bit operating system.  It is an interesting story. [Ref. 3]

Today’s Artificial Intelligence programs have the ability to draw on a much larger database, can self-extract information from published materials, and can be programmed to continually run evaluations based on directed changes in the events/materials being selected.  While the field has grown in the twenty-first century, there are some questions arising about the inability to prevent biases from being incorporated into the programming.  (Possibly something like the situation described above for expert systems?)

A saying that has been around for almost as long as computers is GIGO: Garbage In, Garbage Out!  The quality of the data is critical to the resultant output.  This takes us back to the application of Artificial Intelligence to evaluating novel materials.  The quality of the calculated material properties will be only as good as the data that the calculations are based upon.  There is no question that the working materials can be characterized fairly exactly.  The question is: “Do we know the material properties of absolutely pure materials?”

Semiconductors are created by introducing a very small number of impurity atoms to change the characteristics of the base material.  The mathematics and the experimentation that provided the data are proven.  It works at the bulk scale.  But what happens as the total amount of material under modification becomes smaller and smaller?  This is a serious question that needs to be answered.

In working with materials, the purity is specified.  99.99% pure is quite pure.  Six 9s (99.9999%) is even better; that is one part impurity in one million.  If one uses an approximation that there are 7 atoms of gold in one cubic nanometer, then there are 7 million gold atoms in a 100nm cube, which at six 9s purity implies 7 atoms that are not gold.  How does that change the properties of the material?  How do we purify gold or any other material to remove every extraneous atom?  Do we know the material properties of absolutely pure material?
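The arithmetic, following the text’s 7-atoms-per-cubic-nanometer approximation (the true atomic density of gold is higher, but the conclusion is unchanged):

```python
# Impurity-count arithmetic using the approximation in the text
# (7 gold atoms per cubic nanometer).
atoms_per_nm3 = 7
cube_edge_nm = 100

total_atoms = atoms_per_nm3 * cube_edge_nm**3        # 7,000,000 atoms
for nines, purity in [("four 9s", 0.9999), ("six 9s", 0.999999)]:
    impurities = total_atoms * (1 - purity)
    print(f"{nines}: ~{impurities:,.0f} foreign atoms in a {cube_edge_nm}nm cube")
# four 9s: ~700 foreign atoms in a 100nm cube
# six 9s:  ~7 foreign atoms in a 100nm cube
```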

References:

  1. https://azati.ai/the-return-of-expert-systems/
  2. https://www.parascript.com/blog/machine-learning-ai-vs-expert-systems-ai/
  3. https://en.wikipedia.org/wiki/Artificial_intelligence


Developments as the Year ends

As technology moves further into the nano realm, interesting items are turning up.  Researchers at the University of Manchester, UK, have found quasiparticles in a sandwich of boron nitride, graphene, and another layer of boron nitride [Ref. 1].  The definition of a quasiparticle is rather complex; it can be described as a “phenomenon that occurs when a microscopically complicated system such as a solid behaves as if it contained different weakly interacting particles in vacuum” [Ref. 2].  If the reaction is “so what?”, the answer is interesting: these phenomena provide the opportunity to create electronic properties without needing to use chemical doping.  A new type of semiconductor?

Gold particles have been employed to destroy cancerous cells due to the ability of the particles to rapidly heat when exposed to infrared radiation.  Attaching the nanoparticle to a particle that will be gathered by the cancer enables the gold nanoparticles to accumulate at the cancerous site, which can then be destroyed via IR heating.  Researchers at the University of Cambridge and the University of Leeds [Ref. 3] found that gold nanotubes can enter mesothelioma cells and subsequently destroy the cancerous cells when the nanotubes are subjected to IR radiation.  The advantage of the nanotubes is that they can be tuned, by modifying their thickness, to absorb light at specific wavelengths; there are two near-IR wavelengths that can easily penetrate skin tissue.  Another item in favor of this approach is that the nanotubes are created with dimensions similar to asbestos fibers, so the gold nanotubes should lodge in the same places the asbestos fibers and cancer are.

Nanyang Technological University in Singapore has developed an environmentally friendly, magnetically activated glue technology [Ref. 4].  Typical operations in the fashion-wear industry employ heat to activate the glue that holds materials, like sport shoe components, together.  Since the temperatures can range from 20 to 80°C, this requires significant energy.  The researchers added magnetic nanoparticles to the adhesive material.  A magnetic field activates the adhesive and creates a bond as strong as one obtained through heating.  This process uses much less energy and, considering the size of the industry, could potentially yield significant energy savings.

Japanese researchers have developed an ultra-thin sensor consisting of multiple layers of conductive and dielectric nanomesh structures.  When employed as an artificial skin patch, these structures do not affect the touch sensitivity of the skin beneath the patch [Ref. 5].  The challenge in creating a patch that transmits the sense of feeling is that the skin has a high density of touch sensors, even over areas as small as 50 microns.  The patch must transmit the sense of touch while being flexible enough to follow the underlying skin structures.  The sensor developed consists of two porous layers.  The first layer is insulating and mesh-like, in the range of 200 to 400nm thick.  The second layer is a capacitive network of lines.  The change in capacitance when an area experiences pressure provides the signal that translates into touch.
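As a rough illustration of the readout principle, here is a generic parallel-plate model (my addition, not the team’s actual nanomesh design; the 300nm gap, 50-micron cell, permittivity, and 10% compression are all assumed values):

```python
# Generic parallel-plate model: pressing the sensor thins the dielectric
# gap, raising capacitance (C = eps0 * eps_r * A / d). Real nanomesh
# sensors are more complex; this only illustrates the readout principle.
EPS0 = 8.854e-12        # vacuum permittivity, F/m

def capacitance(area_m2, gap_m, eps_r=3.0):
    return EPS0 * eps_r * area_m2 / gap_m

area = (50e-6) ** 2      # one hypothetical 50-micron sensing cell
rest_gap = 300e-9        # dielectric thickness within the article's 200-400nm range

c_rest = capacitance(area, rest_gap)
c_pressed = capacitance(area, rest_gap * 0.9)   # assume 10% compression under touch
print(f"rest: {c_rest*1e15:.2f} fF, pressed: {c_pressed*1e15:.2f} fF")
# The ~11% capacitance rise is the electrical signature of touch.
```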

Wishing everyone a safe and prosperous New Year with great possibilities for interesting applications of nanotechnology.

References:

  1. https://physicsworld.com/a/new-family-of-quasiparticles-appears-in-graphene/?
  2. Dictionary.com  Quasiparticle
  3. https://physicsworld.com/a/gold-nanotubes-and-infrared-light-could-treat-asbestos-related-cancer/
  4. https://semiengineering.com/manufacturing-bits-dec-23-3/
  5. https://physicsworld.com/a/nanomesh-pressure-sensor-preserves-skins-sense-of-touch  


Technology Roadmaps update & more unusual nano properties

In the June 2018 blog, both the International Technology Roadmap for Semiconductors and the current International Roadmap for Devices and Systems were mentioned in a discussion of the need for roadmaps [Ref. 1].  As the semiconductor industry moves into smaller single-digit nanometer structures, there are increasing challenges.  If one assumes that there needs to be a tolerance of 10% deviation from the straightness of the lines (it’s actually less), something new comes into play.  For a 3nm dimension, a 10% tolerance would imply a 0.3nm deviation.  The issue that arises is that the molecules of material used to form the lines are themselves roughly 0.5nm across.  To achieve the 0.3nm tolerance would require constructing the arrangement of the molecules in such a manner that there would be no misalignment.  There are two issues with this.  The first is that we do not have the technology to create such an exacting structure.  The second, and more important, is that we are unable to measure to this accuracy with the precision and speed required for manufacturing.
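A quick sketch of that tolerance arithmetic across a few line widths (the 10% budget and ~0.5nm molecule size are the assumptions stated above):

```python
# Line-width tolerance vs. molecule size, per the 10% assumption above.
molecule_nm = 0.5   # approximate size of the molecules forming the line
for line_width_nm in (7, 5, 3, 2):
    tolerance_nm = 0.10 * line_width_nm
    feasible = "OK" if tolerance_nm >= molecule_nm else "below molecule size"
    print(f"{line_width_nm}nm line: tolerance {tolerance_nm:.1f}nm -> {feasible}")
# At 3nm, the allowed 0.3nm deviation is smaller than the ~0.5nm
# building block itself -- a single misplaced molecule exceeds the budget.
```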

In some research done at the University of Texas at Austin, unusual properties have been observed.  A rotation of two layers of graphene by 1.12 degrees causes electrons to behave in strange ways [Ref. 2]: electrons start moving 100 times more slowly than expected.  These strange properties of layered 2-D materials appear to result from interactions between the layers.  Part of this effort is directed at developing an understanding of high-temperature superconductivity.  The concept was first identified theoretically and dismissed as not being a real occurrence, and due to the difficulty of developing and measuring an experiment, it was not expected to be demonstrated.  The researchers at the University of Texas developed an experimental configuration that permitted the evaluation and observation of the strange properties.

Additional work has been published on using gold nanotubes and IR light to possibly treat asbestos-related cancer [Ref. 3].  Work done at the University of Cambridge and the University of Leeds demonstrated that gold nanotubes can be tuned to have strong IR absorption and can enter mesothelioma cells.  Heating the gold particles destroys the cancer.  Work on gold nanoparticles and IR heating as a method of destroying cancer cells has been ongoing since the early 2000s.  The nanotube form permits penetration of the mesothelioma cells, so the attachment challenge can be solved.  IR radiation heating of gold nanoparticles is a proven method of destroying cancer cells, and the ability to attach to the cancer provides the potential for a means of eliminating these specific cells.

The last item in this month’s blog is not necessarily nano, but it is intriguing.  Researchers at Simon Fraser University in Canada have created an experiment to demonstrate that hot water actually does freeze faster than cold water [Ref. 4].  While the experiment did not actually freeze the water, it demonstrated that the greater molecular activity of warmer water distributes the liquid, and therefore the temperature, over a larger area more quickly than in cold water.

References:

  1. June 2018 blog http://www.nano-blog.com/?m=201806
  2. https://phys.org/news/2019-10-physics-magicangle-graphene-switchable.html
  3. https://physicsworld.com/a/gold-nanotubes-and-infrared-light-could-treat-asbestos-related-cancer/
  4. https://physicsworld.com/a/experiments-pin-down-conditions-that-make-hot-water-freeze-before-cold/


Nanotechnology gets even stranger

Nanotechnology is interesting.  Just when one starts to think we have a reasonable knowledge of particle behavior, someone finds something new and interesting.  A basic assumption in thermodynamics is that objects change temperature (warming up or cooling down) at the same rate as a function of the environment of the particles.  Researchers at the Max Planck Institute of Biophysical Chemistry have predicted a temperature-change asymmetry based on mathematical models of confined nanoparticles [Ref. 1].  The modeling predicts that the motion of the warmer particles tends to bring them more quickly into the center of the probability distribution.  The researchers think that their results will improve the understanding of temperature change in nanoscale systems and provide insights into the Mpemba effect, the phenomenon in which water can freeze more quickly when its starting temperature is warmer.  They hope to confirm the theoretical results through physical experiments.

Researchers at the University of Arkansas have developed a graphene-based circuit that they claim can produce clean power [Ref. 2].  Their work appears to contradict Feynman’s argument that Brownian motion cannot perform work.  The researchers contend that micron-sized sheets of freestanding graphene move in a manner that is conducive to energy harvesting.  Their lab tests have indicated that freestanding sheets of graphene can generate an alternating current.  The researchers contend that the thermal movement in the graphene is inherent in the material and not a result of a temperature differential.

Suhas Kumar of HP Labs, R. Stanley Williams at Texas A&M, and Ziwen Wang at Stanford have developed an electronic device that functions like a neuron [Ref. 3].  The device combines resistance, capacitance, and Mott memristance; the most crucial part is the nanometers-thin niobium oxide (NbO2) layer.  (Note: Memristors are devices that hold memory based on the resistance of the current that has flowed through them.  Mott memristors add the ability to incorporate any temperature-driven change in resistance.)  The structure of the material layers requires a high degree of precision, and the researchers developed the circuit through lengthy trial and error.
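For readers unfamiliar with what “functions like a neuron” means, here is a minimal sketch of the integrate-and-fire behavior such a device produces (a generic textbook neuron model of my own, not the team’s NbO2 device equations):

```python
# Generic leaky integrate-and-fire neuron -- an analogy for the spiking
# behavior a neuron-like device produces, NOT the NbO2 device physics.
def simulate(input_current=3.0, threshold=1.0, leak=0.1, steps=50):
    v, spikes = 0.0, []
    for t in range(steps):
        v += input_current * 0.05 - leak * v   # charge minus leakage
        if v >= threshold:                      # fire and reset
            spikes.append(t)
            v = 0.0
    return spikes

print(simulate())   # regular spike times; higher current -> faster spiking
```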

Most materials get thinner when they are stretched.  (A rubber band is a good example: it gets thinner as it is elongated.)  Auxetic materials are different.  Helen Gleeson of the University of Leeds has led research on auxetic materials, which can be defined as materials that also expand in one of the directions perpendicular to the elongation [Ref. 4].  While these materials were first made in the 1970s-80s, the development of synthetic auxetics has not been highly investigated.  The materials occur naturally in complex biomaterials; examples include human tendons.  Inorganic auxetics include copper, gold, and other face-centered cubic materials.  When these materials are stretched, they undergo an internal reorganization, which forms voids that lower the overall density.  Thoughts on potential applications include automotive windshields: where an impact can cause delamination between the various layers, incorporating an auxetic layer could create an expansion where the base material would otherwise elongate and thin, increasing the strength of the initial product.
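In standard elasticity terms (my framing; the referenced article describes this in prose), the axial and transverse strains relate through Poisson’s ratio, which is negative for auxetics:

```latex
\nu = -\frac{\varepsilon_{\text{transverse}}}{\varepsilon_{\text{axial}}}
\qquad
\begin{cases}
\nu > 0 & \text{ordinary materials (thin when stretched)} \\
\nu < 0 & \text{auxetic materials (expand when stretched)}
\end{cases}
```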

As always, the world of nanotechnology provides unexpected insights into the properties of materials in the nanorealm.

References:

  1. https://physicsworld.com/a/nanoparticles-warm-up-faster-than-they-cool-down/
  2. https://www.upi.com/Science_News/2020/10/02/Graphene-based-circuit-yields-clean-limitless-power/1571601661030/
  3. https://spectrum.ieee.org/nanoclast/semiconductors/devices/memristor-first-single-device-to-act-like-a-neuron?
  4. https://physicsworld.com/a/new-auxetic-material-stretches-the-limits/?


Below nano gets more interesting

The last blog mentioned metallic hydrogen at a very small scale.  As computational power increases, we are able to examine (theoretically at least) the behavior of various materials.  A significant amount of effort is being directed at the investigation of a new state of matter in metals, defined as “strange metals”, which actually involves understanding the behavior of the electrons.

The difference from the traditional understanding of metals is that a “strange” metal’s electrical resistance is directly linked to its temperature.  The conductivity is linked to both Planck’s constant and Boltzmann’s constant [Ref. 1].  Planck’s constant establishes the energy a photon can have, while the Boltzmann constant relates the kinetic energy of particles in a gas to the temperature of the gas.
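This linkage is often summarized in the strange-metal literature by a “Planckian” scattering rate; a compact form (my addition, not quoted from the referenced articles) is:

```latex
\frac{1}{\tau} \sim \frac{k_B T}{\hbar}
\quad\Longrightarrow\quad
\rho(T) \propto T
```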

According to Reference 2, a robust computational model of “strange” metals provided sufficient detail to classify these metals as existing in a new state of matter.  The explanation is that the name is based on the behavior of electrons in the metal.  In ordinary metals, electrons travel freely in the material with little resistance and few interactions.  “Strange” metal electrons are more restricted and slower moving than would be anticipated.  In effect, “strange” metals are neither metal nor insulator.  The referenced article also employs the term “reluctant” metals.

This discovery is a result of research on high-temperature superconductivity [Ref. 3].  In 1990, researchers discovered that cuprates have a strange behavior: their resistance does not vary with temperature as anticipated or as in other high-temperature superconductors.  Current theory can explain superconducting properties below 30 K, but that behavior at temperatures up to 130 K was puzzling.  Fermi liquid theory predicts that at low temperatures, a metal’s resistance should depend on the square of the temperature.  Cuprates’ resistance instead varies linearly with temperature down to the point where they become superconducting.  This testing has been performed over a wide range of temperatures and with field strengths of up to 80 T (tesla).
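Schematically, the two behaviors compare as follows (standard textbook forms, added here for clarity):

```latex
\rho_{\text{Fermi liquid}}(T) = \rho_0 + A\,T^2
\qquad\text{vs.}\qquad
\rho_{\text{strange metal}}(T) = \rho_0 + A'\,T
```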

Some research at TU Wien (Vienna University of Technology) is focused on developing higher-temperature superconducting materials [Ref. 4].  The researchers have been using a compound of ytterbium, rhodium, and silicon (YbRh2Si2), which is known for its “strange” properties, and a new molecular beam epitaxy (MBE) process in which they build the material atomic layer by atomic layer.  (A side issue was creating the substrate for building the material; germanium turns out to be a geometrical match for the structure of the new material.)  They have found that a sudden change in the carrier concentration induces the “strange” metal state.  Significant additional work needs to be done to understand the state of these materials before any development can proceed.

The implications of this work are many.  While the mention of superconductivity has long been the basis for projections of high-speed transport via magnetic levitation, the increase in conductivity could also increase the effective amount of electrical power available to everyone.  Currently, transmission losses are significant, and being able to almost eliminate those losses would effectively increase the electrical power available.  This type of application becoming mainstream depends on atomic-level precision and methods of building materials layer by atomic layer.  The ability to create very large amounts of the material is still a few years in the future; tools and techniques for measuring and evaluating the resultant materials still need to be developed.  This is part of the continual search for new material properties starting at the atomic and nanometer scales.

References:

  1. https://newatlas.com/physics/strange-metals-new-state-of-matter/
  2. https://physicsworld.com/a/reluctant-metals-make-a-new-state-of-matter/
  3. https://physicsworld.com/a/strange-metals-become-even-stranger/
  4. https://www.eurekalert.org/pub_releases/2020-01/vuot-anl011620.php
