
National Ignition Facility - Big News?

Ssider

If anyone else started a thread on this I apologize, but I would like to read opinions from forum members on the following article:

http://optics.org/news/3/1/37

If the NIF is claiming a "burn" within the next 18 months then this would be the biggest news in the energy world since Fermi made the first chain reaction, or so I would think.

If I read this right the NIF is not claiming just a controlled fusion reaction (which we've seen demonstrated at a net loss of energy many times over the last few decades) but a break even/gain. After all, the NIF is not some fringe individual who is claiming some exotic LENR.

Any plasma physicists care to elaborate? Is the NIF likely to achieve their lofty goal, or do you all think that they are going to fall flat?

thanks
 
If I read this right the NIF is not claiming just a controlled fusion reaction (which we've seen demonstrated at a net loss of energy many times over the last few decades) but a break even/gain. After all, the NIF is not some fringe individual who is claiming some exotic LENR.

Controlled fusion above the break-even point has been achieved many times in tokamaks. That's not the problem. The problem is sustaining it beyond a few moments, and doing so in an economically viable way.
 
Controlled fusion above the break-even point has been achieved many times in tokamaks. That's not the problem. The problem is sustaining it beyond a few moments, and doing so in an economically viable way.

How did I miss that news? That, in itself, is a big deal. Everything I read said that we had not yet gotten more out than we put into making a fusion reaction.

http://www.gizmag.com/break-even-nuclear-fusion-reactions-possible-within-three-years/16944/

As of Nov. 2010 breakeven was still elusive, or so says everything I've read.
 
I remember it -- not the exact dates, but the events.

1. They got out as much energy as they put in
2. They got out more than they put in

But it's like putting a little gas in a chamber and exploding it -- it's still a long way to a reciprocating piston in a cylinder driving 450 horsepower.
 
How did I miss that news? That, in itself, is a big deal. Everything I read said that we had not yet gotten more out than we put into making a fusion reaction.

http://www.gizmag.com/break-even-nuclear-fusion-reactions-possible-within-three-years/16944/

As of Nov. 2010 breakeven was still elusive, or so says everything I've read.

That's not the case. But again, the problem with tokamaks is that you can't control the plasma very well, and inevitably it drifts and runs into the walls - at which point you have to shut down and fix the hole it made. That's what makes it impractical (and I suppose in that sense it's not breakeven). I don't see how NIF is much different.
 
How many times a second can it fire?

How many times a day can it fire?

Can the energy released, be used by any of the components of the system?

Can the radiation recharge the lasers?

So this excess energy is just converted to heat within the components of the system?

I would love to have a crack at LENR. I only lean toward LENR; even if it exists, it may not be a viable source of energy.
 
How did I miss that news? That, in itself, is a big deal. Everything I read said that we had not yet gotten more out than we put into making a fusion reaction.

It depends on how you define "put into". If you measure the energy actually put into the fuel, achieving break even isn't incredibly difficult. However, if you measure the energy put into the system as a whole, you need to get a couple of extra orders of magnitude out before you see a gain. At NIF, for example, the fuel is given about 140 kJ, which should eventually produce 20 MJ of energy from the fusion triggered (it's still under commissioning and testing, so hasn't actually done this yet). However, the machine as a whole will use over 400 MJ to actually do that, so obviously it's not going to be very effective as a power plant.
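
To make the distinction concrete, here's a quick back-of-the-envelope sketch in Python, just re-using the rough figures above (illustrative only, not official specs):

    # Rough gain arithmetic with the numbers quoted above
    E_fuel_in   = 140e3    # J actually delivered to the fuel
    E_fusion    = 20e6     # J of fusion yield hoped for at ignition
    E_wall_plug = 400e6    # J drawn by the machine as a whole per shot

    Q_fuel  = E_fusion / E_fuel_in      # ~140: huge gain measured at the fuel
    Q_plant = E_fusion / E_wall_plug    # ~0.05: the facility overall is still a big net loss
    print(Q_fuel, Q_plant)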

I don't see how NIF is much different.

Inertial confinement fusion is different because you don't have to worry about plasma confinement, since the whole point is that everything stays where it is (hence the "inertial" part). You just drop pellets into a big chamber and shoot them with lasers. Total momentum added is zero, so the ash has plenty of time to cool down before drifting to the walls or being pumped out. ITER seems to get a lot more media attention, but if NIF pans out inertial confinement will likely be much, much easier to turn into commercial power.

How many times a second can it fire?

How many times a day can it fire?

Can the energy released, be used by any of the components of the system?

Can the radiation recharge the lasers?

So this excess energy is just converted to heat within the components of the system?

http://www.lmgtfy.com/?q=NIF
 
Inertial confinement fusion is different because you don't have to worry about plasma confinement

Sure you do.

You just drop pellets into a big chamber and shoot them with lasers.

At which point they undergo a thermonuclear explosion. That explosion, and all its byproducts, has to be confined and controlled.

How long are the walls of the chamber going to last after being repeatedly exposed to the various high energy particles and gamma rays that get released? How long is the energy transfer mechanism going to last? What happens when the explosion is a bit asymmetric due to a defect in the pellet or a problem with one of the lasers? How often do you have to shut the machine down and fix it? How much does that cost, in joules and in dollars?

These are "just" engineering problems, but the problems with tokamaks are just engineering as well. There's no law of physics preventing fusion power from working - and yet, we don't have any fusion reactors.
 
However, the machine as a whole will use over 400MJ to actually do that, so obviously it's not going to be very effective as a power plant.

Yep. One of the reasons I'm optimistic about NIF: there's a lot of room for improvement in the wall-plug efficiency of lasers; this is something that's getting better and cheaper year by year. Even if the individual "shots" never improve beyond NIF's ambitions, you can still imagine getting a power plant out of it just by replacing inefficient 1995-era lasers with efficient 2015-era lasers.
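
To illustrate why laser efficiency is the lever that matters, here's a hedged sketch - the target gain and efficiency figures are invented round numbers for the sake of argument, not NIF or LIFE specs:

    # Illustration only: gain at the wall plug scales directly with laser efficiency
    target_gain   = 100      # assumed fusion energy out per unit of laser energy on target
    eta_old_laser = 0.005    # assumed ~0.5% wall-plug efficiency, flashlamp-pumped glass lasers
    eta_new_laser = 0.15     # assumed ~15%, plausible for modern diode-pumped lasers

    print(target_gain * eta_old_laser)   # 0.5  -> still a net loss at the wall plug
    print(target_gain * eta_new_laser)   # 15.0 -> comfortably net positive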

Microwave heaters, magnet power supplies, cryogenic refrigerators, etc., for a tokamak? Those are all mature technologies, operating (within a factor of a few) at peak efficiency. To bring a tokamak up to breakeven, all of the performance improvement has to come from improved behavior of the plasma.

That's not an answer to the question "which one will work", but it emphasizes that the *route* towards working is very different in the two cases.
 

I hope this doesn't show up twice. My first attempt didn't go through because I wasn't logged in. I'm new to the board, and apparently I'm unable to post links until I have 15 posts under my belt.


We already have a man-made example of fusion energy gain:

wikipedia Fusion_energy_gain_factor
wikipedia Ivy_Mike

Nature:

wikipedia Proton-proton_chain_reaction


Also, I'm curious why you think we should apply more rigor when analyzing fringe claims. That would suggest we should apply less rigor when considering claims from a trusted source.

I'm not a plasma physicist, but I suspect they will eventually achieve their goal. The energy levels they are producing are record-breaking. They are very close to the proposed theoretical threshold for ignition of 1.4 megajoules (at about 86% of it). It is an iterative process fraught with the technical challenge of maintaining the integrity of sensitive equipment. Will this particular approach be a viable energy source? Maybe not, but it will probably lead to a fundamental and necessary understanding of concepts which lead to a practical design. I'm not just talking about a qualitative understanding of high energy physics. I'm referring to the practical matters of engineering that are unavoidable when considering the economy and efficacy of a design.
 
I was just at a seminar by Mike Dunne, director of LIFE at NIF, so I should be able to answer some of the questions here. Particularly interesting is that NIF and LIFE (it stands for the rather uninformative Laser Inertial Fusion Energy, but it's actually the program to design and develop a working fusion power plant based on NIF) are way more advanced than I thought. NIF is actually aiming to reach ignition by September this year, although a couple of delays mean 2012 might be a bit of a struggle. Early 2013 is virtually certain, however. And LIFE has pretty much a full design and cost analysis done, based almost entirely on existing technology and existing manufacturing base, with funding assumed to be entirely private sector and unsubsidised (although I think the headline figure does include some subsidy that applies to anything nuclear).

At which point they undergo a thermonuclear explosion. That explosion, and all its byproducts, has to be confined and controlled.

How long are the walls of the chamber going to last after being repeatedly exposed to the various high energy particles and gamma rays that get released?

The current goal is up to 4 years. No material is capable of lasting very long in such a high-radiation environment, so the reactor chamber is instead designed as a modular system which can easily be removed and replaced, and can therefore be constructed just out of steel. The chamber will be filled with low-pressure (around 0.03 atmospheres) gas (mainly lead, but I think I remember seeing noble gas mentioned as well; unfortunately I don't have the slides to refer to). This absorbs essentially all the ions and most of the x-rays, leaving the chamber wall relatively unscathed. The gas will have to be cycled to remove the radioactives produced, but this can be done as a slow, constant process rather than having to vent and replace it all at once.

Neutrons pass straight through everything and are absorbed by the coolant sleeve (again modular and replaceable) surrounding the chamber. The coolant will be liquid lithium, which not only works similarly to the liquid sodium in existing plants, but also reacts with neutrons to produce tritium and so can be used to produce fuel, essentially making this a breeder reactor.
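
For reference, the breeding reactions being relied on there are the standard lithium ones (textbook nuclear data, not anything specific from the seminar): n + Li-6 -> He-4 + T + 4.8 MeV with slow neutrons, and n + Li-7 -> He-4 + T + n - 2.5 MeV with fast ones. That's why a lithium blanket around a D-T reactor can, in principle, replace the tritium the reactor burns.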

How long is the energy transfer mechanism going to last?

As above - the reactor vessel will have a lifetime of a few years, while the coolant will be constantly cycled to produce fuel by removing the radioactive part. Everything past that is just the standard heat transfer part of any power station.

What happens when the explosion is a bit asymmetric due to a defect in the pellet

This is one of the main areas of current research. Since they haven't actually achieved ignition yet, no-one's been able to investigate how sensitive it is to various parameters. The assumption to start with is that everything has to be engineered to the same level as at NIF, but the hope is that some leeway will be possible which would greatly reduce the cost of manufacturing the fuel.

As for what actually happens to a pellet that was dropped wrong or failed to get hit by the lasers or whatever, I don't know. Presumably it would just sit at the bottom of the vessel. It wouldn't actually affect subsequent fusion shots, but I don't know if it would cause problems due to funny heating on the vessel and need immediate removal, or if it could just be left there.

or a problem with one of the lasers?

How often do you have to shut the machine down and fix it?

The lasers are modular and hot-swappable, using the same basic system that has been used before in previous facilities (although with different specific components, obviously). I can't remember the name of the place, but >99% availability with an MTBF of over 1500 hours has been managed before, and if anything LIFE should be better.
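
As a quick sanity check on what those figures imply, assuming the usual steady-state definition availability = MTBF / (MTBF + MTTR) (my assumption, not something from the talk):

    # >99% availability with a 1500-hour MTBF implies a mean swap/repair time of roughly 15 hours or less
    mtbf = 1500.0            # hours, figure quoted above
    availability = 0.99
    mttr_max = mtbf * (1 - availability) / availability
    print(mttr_max)          # ~15.2 hours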

How much does that cost, in joules and in dollars?

For an individual intervention, I have no idea. For overall construction and final electricity costs, it's estimated to be ~$3500/kW of capacity and around $45/MWh to the consumer. So approximately equal to coal, more expensive than gas, cheaper than nuclear fission, and much, much cheaper than photovoltaics. Obviously this is based on various estimates, and on the assumption that NIF will actually manage ignition, but the design and estimates are all done in partnership with a whole pile of major private companies involved in energy production, distribution and engineering, so this is as realistic as possible and not just some pie-in-the-sky numbers from scientists for their pet project.
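
As a rough cross-check of those numbers - the capacity factor and fixed charge rate below are my own illustrative assumptions, not LIFE figures:

    # Back-of-the-envelope: how $3500/kW of capital feeds into the per-MWh cost
    capex_per_kw      = 3500.0   # $/kW, figure quoted above
    capacity_factor   = 0.90     # assumed
    fixed_charge_rate = 0.08     # assumed, roughly a 30-year life at ~7% discount rate

    mwh_per_kw_year = capacity_factor * 8760 / 1000           # ~7.9 MWh generated per kW per year
    capital_cost_per_mwh = capex_per_kw * fixed_charge_rate / mwh_per_kw_year
    print(capital_cost_per_mwh)                                # ~$35/MWh from capital alone

which lands in the right ballpark for a ~$45/MWh figure once running costs and delivery are added.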

These are "just" engineering problems, but the problems with tokamaks are just engineering as well.

The difference is that the engineering problems here are mostly solved or have solutions not too far off (one of the few main exceptions being mass production of the actual cryogenic fuel pellet). Even if ITER works perfectly, which hopefully it will, no-one really has a clue how to actually turn it into a commercially viable power plant. NIF already has pretty much the whole thing planned out, and a few bits even partially tested. It's the difference between needing to work out how to implement some of the details on an existing concept and having to come up with the entire thing from scratch.

Look at it this way. ITER is planning first plasma in 2019, with actual ignition well after that, and DEMO designs are expected between 2017 and 2024. NIF is planning to reach ignition this year, and to have the first commercial power plants by the early 2020s. Assuming the actual fusion part works (possibly a big assumption, but looking very hopeful), ICF is at least a couple of decades ahead of tokamaks in terms of actually getting fusion into commercial use. Tokamaks are 5 years away from planning to have the basic idea for a design. ICF is already negotiating prices with suppliers for the parts.

They are very close to the proposed theoretical threshold for ignition of 1.4 megajoules (at about 86% of it).

Actually, they're already operating regularly at the design level of 1.8 MJ, have demonstrated that 2 MJ is possible, and are looking at increasing it to 2.2 MJ, and further into the future (2017-2018) potentially as high as 3 MJ.
 
Controlled fusion above the break-even point has been achieved many times in tokamaks. That's not the problem. The problem is sustaining it beyond a few moments, and doing so in an economically viable way.

True, and for a practical reactor we are going to need something more in the range of 15-30x the Lawson criterion (e.g. a mean of, say, Q = 23). You are going to have to either get a sustained burn or sustain a rapid series of pulses, and generate many times the energy it takes to create and sustain those reactions.
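
For anyone who wants the standard yardsticks: the gain factor is Q = (fusion power out) / (heating power in), with Q = 1 being scientific breakeven and ignition meaning the plasma keeps itself hot with no external heating; for D-T fuel the triple-product form of the Lawson criterion for ignition is roughly n*T*tau_E > 3x10^21 keV.s/m^3. (Textbook figures, quoted from memory.)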
 
I was just at a seminar by Mike Dunne, director of LIFE at NIF, so I should be able to answer some of the questions here.

Thanks, that was very clear and informative. It certainly sounds good - but I've been hearing good-sounding things about fusion power (including IC) for quite a while. I guess we'll have to wait and see.
 
The current goal is up to 4 years. No material is capable of lasting very long in such a high radiation environment, so the reactor chamber is instead designed as a modular system which can easily be removed and replaced, and can therefore be constructed just out of steel.

Last I heard there was talk of using a waterfall of FLiBe as the first wall. It's a liquid, so banging it with neutrons doesn't cause swelling or other problems. (You actually need to blast lithium with neutrons to breed enough tritium.)
 
