Amidst all the excitement over carbon nanotubes and graphene, it is easy to forget that most engineering happens with metals, and almost always with alloys. There are probably about 50 elements that might be used in alloys (the non-radioactive metals plus a few non-metals like phosphorus and carbon), and each might be present in quantities from a few ppm up to 99.99%. This means that there are trillions of possible alloys, even if you limit yourself to only half a dozen ingredients, and use cheap(ish) metals as the main ingredients.
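As a rough illustration of the scale (a sketch only - the six-ingredient limit and the 1% composition grid below are assumptions chosen for the example, not part of the idea), a quick count in Python:

from math import comb

n_elements = 50      # candidate alloying elements
n_ingredients = 6    # "half a dozen" ingredients per alloy
step_pct = 1         # assumed composition resolution of 1%

# Ways to choose which 6 elements are in the alloy.
element_subsets = comb(n_elements, n_ingredients)

# Ways to split 100% into 6 positive parts at 1% resolution (stars and bars).
compositions = comb(100 // step_pct - 1, n_ingredients - 1)

print(f"{element_subsets * compositions:.1e} candidate alloys")  # ~1.1e15, well past 'trillions'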
There are also umpteen different factors that will affect the properties of any given alloy, such as rate of cooling, various heat treatments, and hot or cold working. This gives you many more effectively different alloys.
I would guess that perhaps a few tens to a few hundreds of thousands of alloys have been studied over the centuries, and many useful ones have been discovered. However, this leaves trillions more to be tried. Attempts have been made to predict alloy properties ab initio, but these attempts generally fail, or rely on empirical observations of a small range of alloys. Real alloys often behave completely differently from the predictions.
As a biologist, I am used to "fishing trips" involving vast combinatorial libraries. You want an aptamer that will bind X? Well, you just start with a library of a few billion aptamers and find the ones that work. Similar approaches are used to create novel antibodies and even enzymes.
Materials scientists, however, don't seem to have gone in for this approach. Therefore, MaxCo. proposes developing the Alloyotron - a system for searching vast numbers of alloys for useful properties.
The Alloyotron systematically creates small (say, 0.1 gram) batches consisting of powdered elements, with a huge diversity of compositions. Each little mixture is sintered into a pellet using a combination of heat and pressure. Each little sintered pellet, in turn, is held aloft in a jet of inert gas for a few seconds whilst lasers converge on it and melt whatever it contains. It's held there, molten, for a few moments to allow thorough mixing (though some mixtures will partition), and then cooled either rapidly (by another gas jet shooting it into a cold bath) or slowly (by gradually reducing the laser power).
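If one imagines driving such a machine from software, each bead reduces to a small recipe: which elements, in what proportions, and which cooling path. A minimal sketch in Python (all names and the element list are hypothetical):

import random
from dataclasses import dataclass

@dataclass
class BeadRecipe:
    composition: dict          # element symbol -> mass fraction, summing to 1.0
    cooling: str               # "quench" (gas jet into a cold bath) or "slow" (ramp the lasers down)

ELEMENTS = ["Fe", "Al", "Cu", "Ni", "Ti", "Cr", "Mn", "Si", "C", "P"]  # assumed shortlist

def random_recipe(n_ingredients=5):
    chosen = random.sample(ELEMENTS, n_ingredients)
    weights = [random.random() for _ in chosen]
    total = sum(weights)
    return BeadRecipe({el: w / total for el, w in zip(chosen, weights)},
                      random.choice(["quench", "slow"]))

for _ in range(3):                 # one of these per second, in principle
    print(random_recipe())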
So, the machine is producing a succession of alloy beads, of varied composition and varied cooling rate. Conceivably, they could also be subjected to various additional heat profiles. These beads then drop into a machine that measures properties including hardness, density, thermal conductivity, ductility (by squashing them flat and imaging the resultant shape - more ductile alloys will produce discs; less ductile ones will make flower shapes), Young's modulus and (tricky but doable) tensile strength. Finally, each alloy's melting point (or yield-point) is measured.
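The squash-and-image ductility score could be reduced to a single shape number; the sketch below uses the standard circularity measure 4*pi*area/perimeter^2 (about 1.0 for a clean disc, lower for 'flower' outlines). It assumes OpenCV is available and uses a synthetic image purely as a placeholder:

import cv2
import numpy as np

def circularity(mask):
    # 4*pi*A/P^2 of the largest blob: ~1.0 for a disc, lower for petalled shapes.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    blob = max(contours, key=cv2.contourArea)
    area = cv2.contourArea(blob)
    perimeter = cv2.arcLength(blob, True)
    return 4 * np.pi * area / perimeter ** 2

img = np.zeros((200, 200), dtype=np.uint8)       # stand-in for a camera frame
cv2.circle(img, (100, 100), 60, 255, -1)         # a perfectly ductile 'disc'
print(f"circularity ~ {circularity(img):.2f}")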
All of the above happens automatically, perhaps at the rate of one sample per second. Most of the alloys will be useless, but a tiny percentage of the 30 million alloys analysed during a year will warrant further analysis, and of these a handful will prove to be useful and worth further development and tweaking.
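The 30-million figure follows straight from the one-per-second rate; the hit rates below are placeholders just to show the shape of the funnel:

SECONDS_PER_YEAR = 365 * 24 * 3600                 # ~31.5 million
follow_up_rate = 1e-4                               # assumed: 0.01% warrant a second look
useful_rate = 1e-2                                  # assumed: 1% of those prove useful

beads = SECONDS_PER_YEAR
print(f"beads per year: {beads:,}")
print(f"worth follow-up: {beads * follow_up_rate:,.0f}")
print(f"genuinely useful: {beads * follow_up_rate * useful_rate:,.0f}")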
|
|
Let us know when you start feeding your machine with the Actinide series ... |
|
|
//the non-radioactive metals plus a few non-metals// |
|
|
You'd want a larger bar or rod shape for testing tensile strength, maybe a few cm per side. |
|
|
A small electric furnace with several crucibles/molds might be more practical than the lasers and gas jets, but not as theatrical. Most heat treatment processes take hours, and some take days, so for one sample per second you'd need a massively parallel operation. |
|
|
//You'd want a larger bar or rod shape for testing tensile strength// Now, you see, that's one of the problems with materials science. Ask them to test a sample, and they'll want something big. If they were any good, they'd invent machines for testing the tensile strength of smaller samples. |
|
|
Unless the sample has a large grain size (admittedly an issue for some alloys), there is no earthly reason why you can't measure the tensile strength of a 1mm, or even 100µm sample. The shape is an issue, but not an intractable one. At a pinch (no pun intended) you can measure shear strength instead, by holding the spherical sample between two plates with almost-a-hemisphere dimples in them and applying the shear force. Shear strength is probably as good an initial measurement as tensile strength. |
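For a sense of scale, the force needed to shear a 1 mm bead is modest; the shear strength below is an assumed, merely typical value:

import math

diameter = 1e-3                    # 1 mm bead
shear_strength = 300e6             # assumed ~300 MPa, mid-strength steel

area = math.pi * (diameter / 2) ** 2          # shear plane through the equator
force = shear_strength * area
print(f"~{force:.0f} N ({force / 9.81:.0f} kgf) to shear the bead")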
|
|
Re. the crucibles - yes, but it gets mechanically complex; things react with crucibles or stick to them. |
|
|
Re. the cooling rate - agreed, but testing 0.1 vs 1 vs 10s cooling would be a good start. We're not looking to cover every possible alloy/treatment combination - just a better sampling of the trillions of options. |
|
|
[edit: the annos above appeared while I was writing this] |
|
|
Such a machine would indeed be useful if it could be built. But, there are many more variables than you mention, and the idea that samples could be tested at a rate of once a second is outrageous. |
|
|
For example, even the oldest types of tool steel are usually subject to a two stage heat-treatment of hardening and tempering. Newer tool steels come in oil- and air-hardening varieties, the heat treatment process taking substantially longer than one second. Many grades of high speed steel have heat treatments that take several hours. |
|
|
We know that the properties of steel depend on its crystal structure, but the laser sintering process may make the results entirely unhelpful. It may even produce amorphous steel, with no grain boundaries at all, despite the ingredients leading to a particular structure when mixed in a furnace. |
|
|
The corrosion-resistant properties of wrought iron are due, in part, to a macro grain structure that would be absent from a 0.1g bead. |
|
|
Many alloys behave differently and interestingly after work-hardening, such as extrusions or drawn wire, which would again be difficult to test in the machine you describe. Some alloys are precipitation-hardening - the earliest usable aluminium alloys apparently took several days to reach their final strength. |
|
|
All in all, the brute-force approach to metallurgy seems ambitious, and any variables that can be blindly permuted with ease are probably already exhausted. If you want to create new alloys, the best tool is probably still a microscope. |
|
|
//the idea that samples could be tested at a rate of once a second is outrageous.// It's not outrageous at all. That's the mindset that has left (metallic) materials science back in the 1950s. In contrast, biologists have figured out how to sequence DNA molecules millions (literally) of times faster and ~10,000x cheaper over the last 30 years. |
|
|
//heat treatments that take several hours.// Yes, and we won't be able to examine alloy structures that require that (unless, of course, you send a procession of samples, one per second, along a zoned furnace, similar to the baking of bread). |
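A back-of-envelope check on that zoned-furnace idea (all figures assumed): keeping the one-per-second cadence through a multi-hour treatment just means holding a lot of beads in the furnace at once.

treatment_s = 4 * 3600       # assumed 4-hour heat-treatment profile
feed_interval_s = 1.0        # one bead enters per second
pitch_m = 0.01               # assumed 1 cm spacing along the belt

in_flight = treatment_s / feed_interval_s
print(f"{in_flight:,.0f} beads in transit, belt ~{in_flight * pitch_m:,.0f} m long")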
|
|
//laser sintering process// I'm not laser sintering. I'm sintering first to make a pellet, then melting the pellet with lasers. |
|
|
//macro grain structure// Yes, agreed, large-grain-dependent structures won't be accessible. |
|
|
//work-hardening// Yes, agreed, you could not really do work-hardening (well, you could, quite easily, if these damned metallurgists would stop thinking in terms of anvils and hundred-ton rollers). |
|
|
//any variables that can be blindly permuted with ease are probably already exhausted// So, to take the example of just mixing any 5 of the 50 or so relevant elements; and with say 5 different percentages for each of the elements, there are about 8 x 10^11 possibilities. I do not believe for one instant that any significant fraction of that repertoire has been explored. The Alloyotron will only scratch the surface, but will be orders of magnitude faster than conventional metallurgists. |
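For what it's worth, that figure reproduces if the five element picks are counted as ordered; counting unordered sets gives a smaller but still enormous number, so the argument stands either way:

from math import comb, perm

levels = 5 ** 5                        # 5 percentage levels for each of the 5 elements
print(f"ordered picks:   {perm(50, 5) * levels:.1e}")   # ~7.9e11, i.e. "about 8 x 10^11"
print(f"unordered picks: {comb(50, 5) * levels:.1e}")   # ~6.6e9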
|
|
I'm not suggesting for an instant that a large-scale analysis like this will be efficient: many of the possible combinations would be rejected by a sensible metallurgist before even being made. But I am suggesting that: |
|
|
(a) if metallurgists weren't so hidebound, they could produce a machine for massively parallel (or very fast serial) testing of millions of combinations |
|
|
(b) a useful (though far from exhaustive) subset of conditions can be tested |
|
|
(c) interesting samples would form the starting point of the detailed analysis and optimisation by real metallurgists. |
|
|
// If you want to create new alloys, the best tool is probably still a microscope.// No. Microscopes are excellent tools for understanding alloys that you have made, thereby suggesting ways to improve them. I know, because this is my daughter's field of expertise. She has spent her PhD doing very detailed and sophisticated analysis of essentially one alloy. It's good work. But you can't examine - not with an optical microscope, a TEM, an SEM or an atom probe - an alloy that you haven't made. |
|
|
Incidentally, it's worth re-emphasizing that prediction of alloy structures from first principles is, at present, not possible to any significant degree. |
|
|
I notice a tendency toward using this for steel. A bad idea, I think. Given the volume and efforts that have gone into steel alloys, I doubt there are any low hanging fruit there. There's already a flavor of steel for practically everything. Even then, unless weight is a major issue, it's often easier to just spec 2x the mild steel rather than 4340, for example. |
|
|
Where you might find some joy is titanium and other slightly exotic alloys. At the moment, the development of steel is so mature that there are steels with better strength/weight than titanium alloys... so there's likely more opportunity in fiddling around in there. |
|
|
As the properties of most alloying additives are pretty-well understood, skipping broadly down combinations and leaving gaps to interpolate seems a wise course. |
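One concrete way to "skip broadly and leave gaps" is to sample compositions on a coarse grid; a minimal sketch (the element list and 20% step are arbitrary):

from itertools import combinations

ELEMENTS = ["Fe", "Ni", "Cr", "Mn", "Ti", "Al", "Cu", "Si"]   # assumed candidates
STEP = 20                                                     # coarse 20% steps, gaps interpolated later

def coarse_ternaries():
    # Every 3-element composition on the grid, percentages summing to 100.
    for trio in combinations(ELEMENTS, 3):
        for a in range(STEP, 100, STEP):
            for b in range(STEP, 100 - a, STEP):
                c = 100 - a - b
                if c >= STEP:
                    yield dict(zip(trio, (a, b, c)))

grid = list(coarse_ternaries())
print(len(grid), "coarse ternary samples, e.g.", grid[0])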
|
|
// I doubt there are any low hanging fruit there.// Possibly, but possibly not. The properties of most iron alloys were not deduced a priori, and often vary widely for small changes in the amounts of the other components. It's quite likely that some useful steel alloys remain to be discovered. |
|
|
//titanium and other slightly exotic alloys.// Quite so. |
|
|
//the properties of most alloying additives are pretty-well understood// No, they're not. For *common* alloying elements used one or two at a time with *common* main components, the properties of the alloys are known and, to a very limited extent, "understood". For anything outside the current spectrum of alloys, the effect of alloying elements is not really understood and certainly not predictable in most cases. |
|
|
Hardly. Most alloys I deal with have at least 6 or 7, admittedly some of them in trace amounts. |
|
|
We know what these elements do by themselves when they are subject to the temperatures required. Unless they are chemically combined they will behave similarly in an alloy matrix trial. |
|
|
(Disclaimer: I failed Engineering Materials at university, so take my comments with a pinch of salt. Or pepper. Or something...)
When it comes to work-hardening, tempering, heat-treatments, etc, etc, exactly WHAT happens at the atom and grain level is the question. Once that is understood (someone correct me if it already is...) this system is a go.
In theory, this "small scale" testing should work if you have (at least) 2 grains (to measure grain-boundary properties). Every other property should be able to be "pre-made" in the sample from the outset; being small means things can happen faster during manufacture (no waiting for heat to propagate, for example). The samples (at least some of them) will have to be larger to accommodate larger-grain alloys, but not so big that it can't still be a "high speed" process. |
|
|
That's what I figured. I'm not claiming that this approach will replace good old-fashioned metallurgy. It will give erroneous results for some alloys, and will fail to measure some key properties or to achieve the best structure in some cases. But sheer weight of numbers means you'll get a few interesting and unpredictable leads. |
|
|
It's the same in molecular biological "fishing trips" that produce massive datasets (such as environmental sequencing). The data is generally not as good as you'd get from bespoke experiments; you miss a lot. But give me 30Gb of moderately trashy sequence data and I'll guarantee to find some interesting things in it, and then those are the things you follow up on. |
|
|
I think a key question here is whether alloy properties are scale invariant or not. |
|
|
Curiously, in biology the answer is yes, no, both and neither - in both directions. A mouse's haemoglobin is exactly the same as an elephant's - I'd venture that exactly the same function is performed in both - despite the scaling differences. Meanwhile, assuming that bone constituents are the same in mice and elephants (is that a fair assumption?), it's clear that a mouse's femur has very different operational parameters to those of an elephant. |
|
|
Yes, but the material properties of mouse bone are very similar to those of elephant bone. I think you're confusing intrinsic properties with extrinsic properties; a length of 2x4 is stronger than a pencil, even though they are made of more or less the same material. |
|
|
Almost by definition, the properties of an alloy will be independent of the size of the sample, as long as the sample is large enough to contain a representative number of crystals. |
|
|
There _are_ some things that you can't really measure on small samples; for example, critical crack length is often centimetres or more. But you can measure a lot of other properties (and, to a reasonable approximation, derive the CCL from them). |
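The standard linear-elastic estimate a_c = (1/pi)*(K_IC/sigma)^2 shows how the CCL can be derived from quantities that are measurable on small samples; the values below are assumed, purely for order of magnitude:

import math

K_IC = 100e6     # assumed fracture toughness, Pa*sqrt(m)  (100 MPa*sqrt(m))
sigma = 200e6    # assumed applied stress, Pa              (200 MPa)

a_c = (1 / math.pi) * (K_IC / sigma) ** 2
print(f"critical crack length ~ {a_c * 100:.0f} cm")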
|
|
Also you should measure how brittleness and resistivity change as a result of heat work. We use Kanthal heating elements which have to be thrown away after about 200 uses.
How the alloy is cooled will be a huge variable in this. For example, the alloy that turbine blades are made out of is pretty special, but its properties are vastly more conducive to your aeroplane not crashing in a fiery inferno if it is cooled and pulled out of the melt in the right way, so that the turbine blade is a single crystal and hence has very little 'creep' when heated. This is an example where scale may matter: i.e. it's easier to measure 'creep' on a large scale, and also harder (but concomitantly more useful) to get a big thing to be a single crystal than a small thing. |
|
|
[hippo] now there's a coincidence. My No.1 daughter also works on jet engine alloys (currently for the front end, but she did some work on high-temperature back-end stuff). |
|
|
I would have thought that single-crystal applications were relatively rare. But, I take your point. As I mentioned, this isn't supposed to be a panacea for alloy discovery - just a relatively fast way to trawl through a lot of compositions, with the likelihood of finding at least a few interesting things. |
|
|
And regarding your Kanthal elements - if the sample were small enough, it must shirley be possible to regulate its temperature with a pulsed laser: deliver the heat in short enough pulses that there's no huge thermal gradient (across the very small sample), and measure the temperature by IR emission between the pulses, no? |
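A minimal sketch of that pulse-and-read loop, with the laser and pyrometer calls left as hypothetical placeholders and an arbitrary proportional gain:

import time

TARGET_C = 1500.0       # desired bead temperature
GAIN = 0.002            # assumed: extra joules of pulse energy per degree of error
BASE_PULSE_J = 0.5      # assumed nominal pulse energy

def fire_pulse(energy_j):       # placeholder for the laser driver
    pass

def read_ir_temperature():      # placeholder for the between-pulse pyrometer reading
    return 1480.0

def regulate(duration_s=3.0, period_s=0.01):
    pulse_j = BASE_PULSE_J
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        fire_pulse(pulse_j)
        time.sleep(period_s)                          # dark interval for the IR reading
        error = TARGET_C - read_ir_temperature()
        pulse_j = max(0.0, BASE_PULSE_J + GAIN * error)

regulate()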
|
|
I saw a thing on TV about making single crystal parts. As I recall, it involved a mould with a spiral path. After pouring, the assembly cooled from the top, and although at first crystallisation was chaotic, individual crystals would compete with each other down the spiral, so only one made it through to the part proper. Very clever. |
|
|
Since we're making comparisons to molecular biology, I think it might be worth trying to either utilise some order in the screen (the 'dilution series' or 'gradient PCR' approach), or conversely do things at very small scale completely at random (the 'shotgun library' approach). |
|
|
In the first case, you might prepare stock batches of two or three different molten alloys, then combine them in varying proportions. This allows you to prep a large number of samples varying in one or two ways relatively efficiently, and probably makes it much easier to, e.g., get good mixing of ingredients even with some constituents at low concentration. |
|
|
In the second case, you might create a significant number of different sub-samples in the same form factor (e.g. all mostly iron, but each with, say, an additional 20% base metal or 2% exotic), thoroughly mix them all and then combine into random groups of 5. Then for the interesting ones you'd need to work out what you had by post-trial analysis. |
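A minimal sketch of both schemes (stock compositions and sub-sample names invented for illustration): a gradient between two master melts, and random pools of five pre-made sub-samples.

import random

STOCK_A = {"Fe": 0.70, "Cr": 0.20, "Ni": 0.10}     # assumed master melt A
STOCK_B = {"Fe": 0.60, "Mn": 0.25, "Si": 0.15}     # assumed master melt B

def blend(frac_a):
    # Weighted mix of the two stocks; frac_a = 1.0 is pure stock A.
    elements = set(STOCK_A) | set(STOCK_B)
    return {el: frac_a * STOCK_A.get(el, 0) + (1 - frac_a) * STOCK_B.get(el, 0)
            for el in elements}

gradient = [blend(f / 10) for f in range(11)]       # 0%, 10%, ..., 100% stock A

SUB_SAMPLES = [f"Fe+{el}" for el in ("Cr", "Ni", "Mn", "Si", "Ti", "Cu", "Al", "V")]
pool = random.sample(SUB_SAMPLES, 5)                # one random 'shotgun' group

print({k: round(v, 3) for k, v in blend(0.3).items()})
print(pool)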
|
|
[Loris] Ah yes, that sounds familiar - I must have watched the same thing as you |
|
|
Is this a good project for a deep learning algorithm? Given the number of current alloy data points, the computation may spot a pattern unseen by human logic. This S.A.D. machine would be the data sensor to the algorithm's investigation. |
|
|
One problem is cooking times. It may miss alloys that need a millennia at a certain temperature or condition for the unique properties. |
|
|
That's already been acknowledged, multiple times. |
|
|
Machine learning might be profitably applicable, but I wouldn't expect it to be easy. |
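If every bead's composition, cooling history and measured properties were logged, a first machine-learning pass might look like the sketch below (scikit-learn, with synthetic stand-in data purely to show the shape of the problem):

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in data: 10,000 beads, 8 element fractions plus a cooling-time column.
X = rng.random((10_000, 9))
# A fake "hardness" with some nonlinear structure, so there is something to learn.
y = 3 * X[:, 0] * X[:, 1] + np.sin(5 * X[:, 8]) + 0.1 * rng.standard_normal(10_000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print(f"R^2 on held-out beads: {model.score(X_test, y_test):.2f}")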
|
|
But I want to build this machine now. |
|
|
Gr. Millennium (singular) |
|
|
So, I asked my daughter (who is a metallurgist) about this. She agreed it would be possible and that you'd probably find some interesting alloys. |
|
|
However. The main problem she identified is that different research groups work on specific applications, with very little crossover. So, if your group is working on, say, turbine-blade alloys they are not going to be equipped for (or be interested in) following up on new alloys that are, say, super-hard or remarkably ductile. |
|
|
There's also a regulatory aspect, at least in aerospace. New companies can spend decades earning the credibility to develop new alloys of a particular type and getting them certified for use. So, if a mass-screening program came up with a completely new alloy, it would not be commercially viable for them to develop it as far as aerospace applications (or, presumably, other safety-critical applications). |
|
|
Those "problems" make me want to do this even more. |
|