
Create H5P Image Hotspots

What are Image Hotspots?

An H5P Image Hotspot (also referred to as an interactive image) is an interactive infographic. A hotspot is a clickable area on an image that expands to reveal more information. This information can be text, an image, audio, video, a weblink, or a combination. This article explains how to create an H5P Image Hotspot and gives examples of where hotspots can be used.
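Conceptually, each hotspot is just structured data attached to an image: a position plus the content revealed on click. The sketch below is an illustrative model only; the field names and file names are assumptions for illustration, not the actual H5P Image Hotspots schema.

```javascript
// Illustrative model of an image hotspot (field names are assumed, not the real H5P schema).
// Positions are percentages of the image width/height, so hotspots scale with the image.
const hotspot = {
  position: { x: 42.5, y: 17.0 },   // percent from the top-left corner of the image
  header: "Formative assessment",
  content: [
    { type: "text", value: "Short explanation shown in the pop-up." },
    { type: "link", value: "https://example.com/more-detail" }  // placeholder URL
  ]
};

// A whole interactive is then just an image plus a list of hotspots.
const interactive = {
  image: "balance-sheet.png",       // hypothetical file name
  hotspots: [hotspot]
};

console.log(interactive.hotspots.length); // 1
```

The percentage-based positioning is why a hotspot stays anchored to the same spot on the image whether it is viewed on a phone or a desktop.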


What can Image Hotspots be used for?

Hotspots can provide clarity and explain the details in an image, allowing students to unpack concepts, and encourage exploration. Hotspots can be used with diagrams, maps and photographs of objects or scenes. Students can explore the information or be directed to other resources using web links.

Example: English Education - Assessment principles in English education

The following H5P image hotspot was used to provide additional information about different assessment techniques. Students start with the bigger picture of the topic and then unpack each technique, helping them to contextualise and link the information to other concepts. Initially, an overview of the techniques is presented. Clicking on a hotspot reveals more information: a text paragraph and some brief dot points, followed by an external link to more detailed information. In this way, the hotspot allows students to process the information in 'chunks' and at their own pace.

Example: Southern Cross University Locations

The following H5P is used in a Welcome to SCU module for academics who are new to teaching at Southern Cross University. The H5P provides an overview of the main campus locations using a map of Australia. Clicking on a hotspot displays images and text that showcase each location. 

Copyright Notification

Click on the "Rights of use" button in the toolbar (bottom left) to see how the copyrighted images were cited in this interactive.

Example: Accounting for Managers - Simple example of a balance sheet

The following image hotspot was developed for an accounting unit to describe the different components of a basic balance sheet. Each component of the balance sheet includes a hotspot that provides relevant information. This supports students to better grasp the purpose of each balance sheet entry and to focus on the parts they are struggling with.

Create Image Hotspots

Now it's your turn. Follow along with the video below, which covers making an Image Hotspot using an image, including how to add copyright information.

Test it out!

Here is the H5P we made in the video above. Try out the interactive content!

Best practice recommendations

The following strategies will assist you in making the most out of an H5P Image Hotspot.

Do provide instructions for users

Provide instructions within your learning site on how to use the image hotspot, as well as the kind of information that is inside the interactive. For example, an image hotspot about viruses could be accompanied by prefacing text such as:

"Click on the information icon to show more information about each virus, including the family and a link for further reading"

Don't hide key concepts within a hotspot

Avoid placing key concepts or important information inside Image Hotspots. Hotspots are best used to reiterate information and add additional context; if important information is hidden inside an image hotspot, students scanning your unit may miss it entirely.

Do remember to check accessibility

It is important to consider the accessibility of the interaction so that all students will be able to get the most from the experience. Some key ways to ensure accessibility:

  • Use high contrast: When adding a hotspot to an image, you can change the icon and colour. Make sure that the style you choose stands out from the background, so students can easily find the hotspots on the image.
  • Use alternate image text: Ensure that the alternate image text communicates the information of the infographic equivalently so someone using a screen reader receives the same information.

Don't forget image attribution

If you are using an image that is not open source, ensure you credit the source. Image sources can be attributed using the H5P metadata. See Publishing H5P in Blackboard for how to add metadata to an H5P.

Copyright data from within the H5P will be available from the H5P taskbar as shown below. Clicking on "Rights of use" shows the copyright information.

h5p course presentation image hotspot

For more about using images at Southern Cross University, see Copyright.

Step-by-step tutorial

More information about creating an Image Hotspot H5P is available in the H5P Image Hotspot Tutorial.



Create content - H5P


Image Hotspots makes it possible to create an image with interactive hotspots. When the user presses a hotspot, a pop-up containing a header and text or video is displayed. Using the H5P editor, you may add as many hotspots as you like.

You can configure:

  • The number of hotspots
  • The placement of each hotspot, and the associated pop-up content
  • The colour of the hotspot 

Learn how to create Image Hotspots in this tutorial.

General guidelines

  • In the Description field, provide clear and concise instructions for the hotspot activity. If hotspots link to different content formats (e.g., video clips), explicitly mention this in the description. 
  • Label each hotspot with descriptive text that unambiguously conveys what the hotspot represents or links to. Avoid relying on text embedded within images. 
  • If the order of hotspot usage is important, explain this in the instructions and include numbers in the labels to indicate sequence. 
  • Ensure that hotspots and background colours have sufficient contrast to accommodate users with visual impairments. 
  • Ensure that text within the hotspot activity meets minimum size requirements for readability.
  • Provide descriptive alternative text (alt text) for any images or graphics used within the hotspot activity. This is essential for students with visual impairments. 
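The contrast guideline above can be checked numerically rather than by eye. The sketch below computes the WCAG 2.x contrast ratio between two sRGB colours (for example, a hotspot icon colour against the image region behind it); WCAG recommends at least 3:1 for graphical objects such as hotspot icons.

```javascript
// WCAG 2.x relative luminance of an sRGB colour given as [r, g, b] in 0-255.
function luminance([r, g, b]) {
  const lin = (c) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// Contrast ratio between two colours (1:1 = identical, 21:1 = black on white).
function contrastRatio(c1, c2) {
  const [hi, lo] = [luminance(c1), luminance(c2)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// A black hotspot icon on a white background gives the maximum possible contrast.
console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(1)); // 21.0
```

If the ratio for your chosen icon colour falls below 3:1 against the busiest part of the background image, pick a different icon style or colour.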

Video and audio content

  • If your hotspot activity includes video elements, make sure they have accurate captions or transcripts for students with hearing impairments. 
  • Ensure that video or audio content within the hotspot activity does not start playing automatically as it can be disruptive to some users. 

Additional considerations

  • Consider offering alternative formats or activities for students who may have difficulty with the hotspot interaction. Examples include providing a text-based alternative or offering a different assignment.
  • Last Updated: Aug 21, 2024 12:44 PM
  • URL: https://city-uk-ett.libguides.com/staff/moodle/h5p

Creative Commons Licence


There are three types of hotspot content: 'Image Hotspots', 'Find the Hotspot', and 'Find Multiple Hotspots' (the latter two use right and wrong placements).

Geography Find Multiple Hotspot Example

Economics Example

Political Science Example (with optional video)

Japanese Image Hotspot Example (with optional sound)

H5P Examples Copyright © by Michelle van Ginneken. All Rights Reserved.


H5P Content Types

Image Hotspots

In the Image Hotspots content type you can configure:

  • The number of hotspots
  • The placement of each hotspot, and the associated pop-up content
  • The colour of the hotspot

Example of Image Hotspots Content Type

Adding Interactivity to Pressbooks Copyright © 2018 by Sally Wilson is licensed under a Creative Commons Attribution 4.0 International License , except where otherwise noted.

Creating an H5P Course Presentation

With the H5P Course Presentation tool, you can create and present asynchronous educational material to your students with slides that incorporate multimedia and interactive features. This format enables you to assemble your course material in a dynamic and interactive way. Students can navigate through the slides, engaging with embedded quizzes and videos as they go. It's an effective alternative to traditional methods like PowerPoint slides, PDFs, or standard text-heavy web pages. 

The H5P Course Presentation allows educators to create and present content in an engaging way, consistent with Active Learning principles. The Course Presentation can also be used for formative tasks to provide students with opportunities for self-assessment and consolidation of knowledge.

Exportable Text Areas in a H5P Course Presentation also offer a unique opportunity for students to engage with your content. These text areas can be embedded throughout the presentation, allowing students to record their reflections, ideas, notes, and answers to questions as they interact with it. Anything the student inputs in these fields will be collated into an individual summary that they can download as a Word document.

Preview an example H5P Course Presentation (opens in new tab), which includes Fill in the Blanks, Video, and Multiple Choice questions.

Moodle - H5P - Course Presentation Editor

The Slides tab and settings menu

H5P Course Presentation Slides Menu

The top menu

Using the top menu you can:

  • Add or format text,
  • Add or update a hyperlink,
  • Add or update an image,
  • Add shapes (square, circle, horizontal or vertical lines),
  • Add or update a video,
  • Add or update a link to a slide within the H5P Course Presentation,
  • Add or update an audio file. 

You can also add the following interactive elements:

  • Fill in the Blanks
  • Single Choice Set
  • Multiple Choice
  • True/False Question
  • Drag and Drop
  • More elements. This includes other less common but still useful options, including Tables, Interactive Video, Exportable Text (covered later in this article) and more. 
  • Paste content from the clipboard. 

Slide options

H5P Course Presentation Slide Options

From left to right: 

  • Show or hide the sidebar navigation,
  • Current slide / Total slides. In the image above, the Course Presentation is displaying slide 2 of 3,
  • Add new slide,
  • Clone slide,
  • Slide background - please be mindful of colour contrast requirements when selecting background colours and images,
  • Move slide left - e.g. move the current slide (slide 2) to become slide 1 in the order,
  • Move slide right - e.g. move the current slide (slide 2) to become slide 3,
  • Delete slide.

Moodle - Edit Mode On

  • Enter alt text into the Alternative text field.
  • (Optional) You can provide hover text – text displayed when students hover their mouse pointer over an image.
  • Comments can also be added to the image; these are displayed at the end of the presentation when students choose to display suggested answers.

Moodle - H5P - Course Presentation Editor - Add Image - Metadata Button

  • Drag and drop the image to move it to the desired position on the slide.

Moodle - H5P - Course Presentation Toolbar - Text Button

  • Use the Edit text toolbar to format text.
  • Comments can also be added to the text; these are displayed at the end of the presentation when students choose to display suggested answers.
  • Background Opacity: a 0 value means the text box will be transparent, and a 100 value means the text box will have a white background.
  • Check Display as button to replace the text box with a button students click on to read the text in the pop-up.

Moodle - H5P - Course Presentation Editor - New Slide Button

  • The text entered in the Title text field will be displayed on the slide.
  • Enter the URL into the URL text field.
  • Comments, opacity, and the ability to display the link as a button rather than text are the same as described above for images.

Moodle - H5P - Course Presentation Toolbar - Summary Button

  • The Summary activity allows students to make an interactive summary of the key points in the Course Presentation.
  • After adding introductory text or a question, add statements that summarise the Course Presentation. Note: The first statement is the correct statement.

The Exportable Text activity allows students to write into text fields within an H5P Course Presentation and export the collated text to a Word document.

You can include prompts for the students in the text boxes, as well as generic automated feedback in the exported document.

Some potential use cases include:

  • Summarising the subject material in the student's own words for note-taking,
  • Reflecting on stimulus,
  • Writing pre- and post-evaluations on perspectives prior to engaging in another activity, e.g. prior to and after completing a module,
  • Drafting components of essays with integrated feedback.

With the ability to export the file as a Word document, students can easily upload the document to a Moodle Assignment for marking. Students can edit the Word document, once downloaded, as with any other Word document.
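To make the export behaviour concrete, here is a minimal sketch of how labelled answers and optional automated comments could be collated into a single summary. This is illustrative only, not the actual H5P export code, and the field names ("label", "answer", "comment") are assumptions.

```javascript
// Hypothetical sketch of collating Exportable Text fields into one summary document.
function collateExport(fields) {
  return fields
    .map(({ label, answer, comment }) => {
      const lines = [label, answer || "(no answer entered)"];
      if (comment) lines.push(`Feedback: ${comment}`); // generic automated feedback
      return lines.join("\n");
    })
    .join("\n\n");
}

const summary = collateExport([
  { label: "Reflection on the stimulus", answer: "My notes...", comment: "Compare with Topic 2." },
  { label: "Pre-evaluation", answer: "" }
]);
console.log(summary);
```

The real feature produces a Word document rather than plain text, but the idea is the same: every labelled text area the student fills in becomes one block of the exported summary, with your feedback appended.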

Steps for Creating an Exportable Text Activity

To create an Exportable Text activity in the H5P Course Presentation:

  • From the H5P Course Presentation, click More elements (three vertical dots) to show more options.

H5P Course Presentation Exportable Text

  • Enter a Label/heading to display the answers in the exported document.
  • To include prompts in the text entry boxes, enter the text for the prompt and click Always display comments. This will then show as a question mark popup in the corner of the text entry area, as displayed below.
  • To automate feedback, ensure Include comments in the exported document is selected, and uncheck Always display comments.
  • Click and drag to move or resize the Exportable Text Area on the slide. 

H5P Export Text box

  • Click the Save and display button to view the H5P Course Presentation.

Related information

  • Creating Multiple Choice questions using H5P | L&T Hub resource
  • Creating interactive videos with H5P | L&T Hub resource
  • Creating Fill in the Blanks Activities with H5P | L&T Hub resource
  • H5P Documentation - H5P Course Presentation tutorial | External resource
  • UOW Office Kit available via the intranet | UOW resource

Contact Learning, Teaching & Curriculum

Request support

Contribute to the Hub

Provide feedback

Image Hotspots

Image Hotspots makes it possible to create an image with interactive hotspots. When the user presses a hotspot, a popup containing a header and text, image or video is displayed. The author may add as many hotspots as they like.


Teaching and Learning Technologies

Using Lumi and H5P to Create Easy Images with Hotspots, Timelines, and Sequencing

What are Lumi and H5P?

H5P is an open-source application for creating interactive and engaging learning objects quickly and easily. Until now it required an on-campus installation, but there is now an application called Lumi Education that allows you to create these interactive objects and even embed them into OAKS for your students to use.

How It Works

H5P Editor Start

  • Select your operating system to download the app for your computer. NOTE: if you get sent to a page asking for a donation, just click the back button and try again. Next time it shouldn't ask you.
  • Save the installer, then install the Lumi app.
  • Once it’s installed, click on the Lumi app to open it.
  • Under H5P Editor choose Start .
  • Choose Create New H5P .
  • Next to the item you want to create, click Get to load this option into your Lumi app.
  • Click Install (if first time).
  • Then click Use .

screenshot of tutorial and example

Image Sequencing

The Image Sequencing content type allows authors to add a sequence of their own images (with optional image descriptions) to the game in a particular order. The order of the images will be randomized, and players will have to reorder them based on the task description.

You and your students can use these to:

  • learn a process
  • sort anything

Question showing sorting planets by size

The Timeline content type allows you to place a sequence of events in a chronological order. For each event you may add images and texts. You may also include assets from Twitter, YouTube, Flickr, Vimeo, Google Maps and SoundCloud.

Timelines can be used for more than dates.

screenshot of timeline

Hotspot Images

Image hotspots makes it possible to create an image with interactive hotspots. When the user presses a hotspot, a popup containing a header and text or video is displayed.

  • Expand the information in an infographic
  • Explain data in a map or add history to a map
  • Explain specific details of artwork
  • Create games
  • Great for foreign language or English vocabulary

screenshot of hotspot map

REMEMBER: each learning object type in Lumi contains a built-in tutorial, so be sure to use those to learn how to create these. Most are self-explanatory and easy to learn, but having the tutorials is handy.

View a tutorial on how to add these to OAKS: https://docs.google.com/document/d/1ORFHEy2U0YkHImzc5qS0zs34PoaBfbhpaBNIZDjgZfc/edit?usp=sharing


Published by Mendi Benigni




Create images with hotspots

h5p/h5p-image-hotspots


H5P Image Hotspots

Create images with hotspots. For each hotspot you can define header and text content which will be displayed when clicking the hotspot.
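As a rough illustration of what a library like this does at runtime (this is illustrative code, not the repository's implementation), hotspots stored at percentage coordinates can be hit-tested against a click like this:

```javascript
// Illustrative hit-test: which hotspot (if any) was clicked?
// Coordinates are percentages of the displayed image, so the same hotspot
// positions remain valid at any rendered image size.
function findHotspot(hotspots, clickX, clickY, radius = 2.5) {
  return hotspots.find(({ x, y }) => {
    const dx = x - clickX;
    const dy = y - clickY;
    return Math.sqrt(dx * dx + dy * dy) <= radius;
  }) || null;
}

const hotspots = [
  { x: 10, y: 20, header: "Assets" },      // hypothetical hotspot data
  { x: 70, y: 55, header: "Liabilities" }
];

console.log(findHotspot(hotspots, 70.5, 54.2)?.header); // "Liabilities"
console.log(findHotspot(hotspots, 50, 50));             // null
```

A matched hotspot would then have its header and text content rendered in a popup; a miss does nothing.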

(The MIT License)

Copyright (c) 2015 Joubel AS

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.


Chris Marler Photography

Blogs: Photography Guide To The Palouse


I’ve seen a few blogs and online Palouse photography guides, but none that were comprehensive enough to answer detailed questions that photographers have about this region. So, I decided to fill that void and write one myself. What makes me qualified to write a photography field guide to the Palouse? For starters, I grew up in Idaho and spent four years at the University of Idaho, which is smack dab in the heart of Palouse country. During those college years, I explored the area and all its back roads, state parks, remote towns and bars. Many of my college friends are from the Palouse area and still live there today. Thirty-something years later, I still return to the Palouse every year to visit my sister-in-law, attend alumni events, and photograph this beautiful landscape.

I took up nature photography shortly after college and have spent hundreds of hours since then taking images of this region. The Palouse is a short half-day drive from my home near Seattle, and I can’t wait to go back each year. Every season is different, and I always come back with new and fresh images. I think I’ve been on every dirt road in the county and know many of the twists and turns of the rolling hills by heart. Sometimes I will see a photo of an old barn or even a tree, and chuckle since I know exactly where that subject is. It’s almost to the point that I can’t get lost on the backroads anymore. I hope this article helps you answer the following questions, and please shoot me an e-mail if you have any questions.

  • Where is the Palouse? – Location information
  • Geology of the Palouse – Brief history of how this land was formed
  • What can I photograph? – Photography subjects and ideas
  • How do I get there? – Travel information: airports and roads
  • Where do I stay? – Local hotel and motel recommendations
  • How do I get around? – Best mode of transportation
  • How long should I stay? – Recommended number of days needed to photograph the area
  • Best Time to Visit – My favorite months to photograph the Palouse
  • Best Locations – Specific locations for photo opportunities
  • Palouse Photography Maps – Links to detailed maps of the backroads

Where is the Palouse?

Let’s start with the basics. The Palouse is a unique geographic region rich in agriculture that encompasses parts of eastern Washington and northern Idaho. Although I’ve heard it covers as much as 5000 square miles, the photogenic area is closer to 3500 square miles. I consider the prime photography range to run from Lacrosse, Washington in the west to Troy, Idaho in the east. From north to south, I would stay between Oakesdale, Washington and Uniontown, Washington.

Special Notes: There is a small town in Washington State named Palouse, which is within the boundaries of the general “Palouse area”. This can cause confusion, so it’s best to clear that up right away. Also, Palouse Falls is an extraordinary waterfall that most people have heard of, but it is outside the general boundaries of the Palouse area, and will be covered later as an individual topic.

The Palouse is an agricultural mecca, primarily producing wheat, legumes, lentils, barley, and chickpeas. In fact, it’s widely recognized as the most productive farmland in America for wheat. Canola is also grown here, which is easy to identify in the spring due to its vibrant yellow color. These gentle rolling hills were formed over tens of thousands of years from wind-blown dust and silt, called “loess”. The last Ice Age was primarily to blame, as winds picked up the fine sediments from glaciers and deposited them in this region, creating a landscape of rolling hills that resemble sand dunes in shape and size. The Palouse is truly unique in both its geology and its fertile farming grounds, and it is now recognized as one of the top places in the United States for photography.

You can find additional geology information on the Wikipedia Palouse page.

What can I photograph there?

Landscape Photography

Field of View

Limited Edition Prints Available

It’s all about the rolling hills and fields. The Palouse landscape is what photographers flock here for. What makes the Palouse so unique and visually striking is the endless rolling terrain. Texture, color, and shape make this a paradise of sorts for landscape photographers, especially those seeking patterns and abstracts in their images. In the spring months of May and June, the rolling hills are a mix of green wheat fields, yellow canola fields, and deep brown soil. In summer, fields of gold dominate the landscape. This image titled "Field of View" is typical of the type of photo you can expect to return with.

Farm Life Photography

You may not have considered other photography opportunities besides the landscape. But there is so much more. Here’s a list of other possibilities that can be found all over the Palouse.

Old barns and farmhouses – There are old barns and abandoned farmhouses scattered throughout the region which make striking subjects all by themselves, or great foregrounds surrounded by the rolling hills and fields. Some of the barns are known to be over 100 years old. My all-time favorite barn was along a dirt road I stumbled across back in the late 1990s. Unfortunately, time took its toll and it is no longer standing. But there are plenty of barns still around.

Farm Equipment – Some of my favorite images from the Palouse are farmers riding in their tractors and combines working their fields. As they plow the steep hills and valleys, dust and dirt kick up from the machines and make for some dramatic scenes. I find it rather amazing, and I assume a little dangerous, that they can plow this steep terrain. There is also plenty of standing farm equipment all over that can easily be used as foreground material in your photography.

Granaries – Many of the Palouse photos you see include granaries, especially those taken from Steptoe Butte. They certainly add context to your images.

Crop Dusters – Many of the farmers here are also pilots. Crop dusting is still the primary way of spreading pesticides, and crop dusters are easily spotted and photographed. They fly very low to the ground to spray, and the white plume often makes for striking images as they pass right overhead.


Wildlife Photography

Stare Down

Wildlife photography isn’t why nature photographers visit the Palouse, but there are a few opportunities.

Whitetail Deer - As you might expect, deer love this area. You will certainly see the Whitetail deer in the wheat fields especially closer to dusk. I can’t tell you how many times I have rounded a corner and come across a beautiful scene with deer close by, only to come away with no images. That’s because I usually have a wide-angle lens attached to the camera for landscapes, and by the time I switch to a longer lens, the deer are halfway across the county. It might be worth taking an extra camera with a long lens attached just for this reason.

Coyotes – You can certainly hear the coyotes during the sunset hour at Steptoe Butte. They stay out of sight during the day. But if you are an astrophotographer, it's likely you will encounter one or two.

Hawks – Red-tailed Hawks and Cooper’s Hawks frequent the area. They love to perch on fenceposts and can be easily spotted.

Quail - You are very likely to see quail at some point during your trip. I love these birds, but they are very skittish (and fast), so getting quality photos is challenging to say the least.

Pheasants - I think Ring-Necked Pheasants are one of the more stunning birds in North America. They used to be everywhere in the Palouse. Sadly, their numbers have dwindled and it takes a little luck to see them anymore. But they still reside in the area, so keep an eye out.

Great Horned Owls - There is another species that lives here which may surprise you, the Great Horned Owl. Specifically, they like to hang out in the forested area of Kamiak Butte. Kamiak Butte requires a bit of a hike through a forest area to reach the top and get spectacular views of the valley below. While hiking up the trail, keep a close eye out for the owls that live in the treetops. I see them more often than not when hiking this trail, and sometimes at close range. I must say my best image ever of a Great Horned Owl was taken at Kamiak Butte. I titled this photo "Stare-Down" for obvious reasons. For a complete listing of all the birds of the Palouse, take a look here.

Birds of the Palouse

College Campus Photography

Perhaps photographing college campuses isn’t your thing, but there are two campuses in the Palouse if you want to try something different. During the harsh mid-day sun, I often look for other photography options anyway.

Washington State University resides in Pullman, which isn’t a small college since it has over 29,000 students. It’s a Pac-12 school which happens to be in a rather small town.

The University of Idaho is just across the state line in Moscow, Idaho. It's only nine miles down the road from Washington State University. It has approximately 12,000 students and is a member of the Big Sky conference. In my opinion, the University of Idaho campus is more photogenic. Photographing its Greek row of fraternities and sororities is a fun way to spend a few hours. Since I am an alumnus of the University of Idaho (Go Vandals!) I fully admit to being biased. Autumn is usually the best time to take images of college campuses, but you can have the entire campus to yourself if you visit in June or July.

Astrophotography

There's not a major city within 100 miles, which means dark skies. The eastern side of Washington State is nothing like its more famous Pacific Northwest counterpart on the west side. The Cascade mountain range separates the Seattle area, with its notoriously cloudy, rainy weather, from the more open terrain, sun-drenched farmland, and clear skies on the east side of the state. It's a very underrated location to shoot the night sky.

Road To The Stars

For astrophotographers, the Palouse can be a gold mine. I typically find an old barn or interesting country road to use as foreground material, then frame the Milky Way over it. Check out the image on the left. For this one, I stood in the middle of a highway for half an hour to get this shot. I never had to move due to an approaching vehicle. The only noise I heard was from a few coyotes.

The most difficult aspect of astrophotography in the Palouse is trying to find a structure like an old barn which has some separation from the private residence. Private residences will have their lights on which definitely interferes with night sky photography. So planning is key. While out on your photo shoot during the day, it’s important to make mental notes of possible locations for a return trip that evening. Arrive before sunset at your pre-determined location, get set up, then wait it out. Trying to find a good spot after dark will be next to impossible.

I should mention the northern lights as well. The aurora borealis can make an appearance in Washington state, albeit not very often. It has to be a very strong show for us to see the northern lights at this latitude. But it does happen from time to time. When that event occurs, the east side of the state gives you a much better opportunity than the Seattle area due to fewer clouds, more open areas, and darker skies.

Field of Dreams

Thunderstorm Photography

Palouse Thunderstorm

I love photographing thunderstorms. Seattle rarely has these violent, fast-moving storms with thunder and lightning, but the Palouse does. Dark clouds on the horizon and a change in weather can move in fast and create extraordinary opportunities for dramatic landscape images. Granted, a late-afternoon storm may ruin your pre-visualized idea of that beautiful sunset image at Steptoe Butte State Park, but it can open up even more eye-popping possibilities. Last June I was fortunate enough to watch a thunderstorm move in over the rolling hills, and the light and drama it produced will stay with me forever; it gave me a sense of what one of these storms must feel like in the Midwest. The point is: don't be discouraged by the weather. Go with the flow and you will be rewarded. Most likely the next day will be gorgeous.

How do I get there?

If you require air travel, the closest major airport is Spokane International Airport. From Spokane, it's only a one-hour drive to reach the Palouse area, taking US-195 south to Colfax, Washington. There is a very small regional airport in Pullman, Washington (Pullman-Moscow Regional Airport), which is closer but requires smaller planes and has limited arrival and departure destinations.

If driving, I recommend using Google Maps and plugging in Colfax, Washington or Pullman, Washington as your destination. More on specific accommodations in the next section. From Seattle, take I-90 to the small town of Vantage, then WA-26 all the way to Colfax or Pullman. If driving from the east on I-90, take US-95 south near Coeur d'Alene, Idaho; from there it's another 90 minutes to the Palouse.

Where should I stay?

There are three towns to choose from: Colfax, Washington; Pullman, Washington; and Moscow, Idaho. I'll discuss each briefly and provide links to recommended accommodations.

Colfax, Washington

I prefer Colfax since it is the most centrally located among the prime photography areas and the closest to Steptoe Butte State Park. Steptoe Butte is where you often end up for sunset, so Colfax provides the shortest drive back to the hotel at the end of your day. There is a caveat: Colfax has very little in terms of restaurants, pubs, etc. It's not completely devoid of food, but Zip's Fast Food Drive Through is rated the town's 4th-best food option. Just saying. Personally, I can load up on a free breakfast at the Best Western, eat a Subway sandwich in my car for lunch, and put up with the same restaurant in Colfax for a few nights. That inconvenience is worth being close to the prime spots.

Best Western Wheatland Inn - There aren't many hotels near Steptoe Butte. My favorite place to stay when photographing in the Palouse is the Best Western Wheatland Inn: clean rooms, reasonable prices, a free breakfast, and, most importantly, the most central location to the photography hotspots. There are a few other accommodations in Colfax, but if I can't get a room here I would book a room in Pullman.

Pullman, Washington

Pullman is a great second option. In fact, I’ve stayed there many times and have no complaints. If Steptoe Butte is not on your photo agenda for sunset, then Pullman may actually be a better option than Colfax. There are many restaurants and a few pubs here, so it’s a nice place to wrap up your day. Pullman is very close to the main backroads you will be exploring.

Holiday Inn Express – My first choice in Pullman. I prefer this Holiday Inn Express because it's located on the edge of town right off Highway 27, making it very accessible. It also provides a free hot breakfast and is within walking distance of the Birch & Barley pub. This hotel is a little pricey and should be booked far in advance, especially if you visit before Washington State University's school year has ended.

Quality Inn Paradise Creek – A nice alternative is the Quality Inn, just a short walk from the Holiday Inn Express. It also has a free hot breakfast and is usually more reasonably priced than the Holiday Inn Express. It doesn't have quite the same amenities, but then again, how much time will you be spending at the hotel? Probably not much.

There are several other accommodations in Pullman if these two don't suit you.

Moscow, Idaho

Moscow is nine miles east of Pullman, on the eastern edge of the great photography hotspots of the Palouse. Staying in Moscow therefore means more driving than is necessary, and since you will already be in your car most of the day, any additional driving is worth avoiding. I love Moscow, but I consider it my last option when visiting the Palouse to photograph. If you do stay in Moscow, I recommend a hotel on the west side of town.

La Quinta Inn - A very nice hotel on the west side of Moscow. Free breakfast is included. I’ve stayed here before and have no complaints, other than the additional driving as I mentioned above.

Best Western Plus University Inn – The Best Western is also located on the west side of Moscow, not far from the La Quinta. I would choose one of these hotels.

How do I get around?

There is only one answer: by automobile. Photographing the Palouse is done by driving. This is rural America, and you need a vehicle to get around. You will be in the car most of the day and will travel down a lot of roads: some are paved highway, many are well-maintained gravel, and some are dirt. The dirt roads are challenging due to rocks and possible heavy mud after a rainstorm. I highly recommend an SUV, since some of the old country dirt roads can be a bit rough. Another benefit of an SUV is being able to spread out your gear, especially if you have a few other photographers with you.

One thing is for sure. Your vehicle will get dirty. There’s just no avoiding dust and mud kicking up from the gravel roads and painting your car silky grey. Part of the fun!

What lenses do I need?

Enough logistics. Let’s get to the fun stuff. Since this is primarily a drive and shoot excursion, there’s no need to restrict your gear. A typical day is spent driving around the countryside, locating a great scene, parking the car on the side of the dirt road, and getting out right there to take some images. The vast majority of your time photographing will be within fifty yards of your vehicle. That means all your gear can be thrown into the car and accessed as needed.

There are no lengthy hikes (with the one exception of Kamiak Butte) or even many long trails to walk. Since most of the fields are private property and you can’t just go tramping through their wheat and canola crops, I usually photograph from the dirt roads close to my vehicle.

As you might suspect, a wide-angle lens is a must. If I were forced to use one lens for a day in the Palouse, I would grab my Nikkor 17-35. I love images that showcase the grandeur of the area, with barns or granaries included as a subject of interest. My next go-to lens might surprise you: a mid-to-long-range lens, such as a 300mm f/4 or an 80-400mm f/5.6. There are so many occasions when you want to isolate an area of the Palouse, compress a scene to its bare bones, and highlight dramatic abstracts or patterns in the fields. This is especially true on Steptoe Butte, where a long lens is essential for isolating sections of the Palouse countryside below. Finally, a 24-70 will be useful for those shots in between wide angles and isolated details.

That's a long-winded way of saying that all lenses can be used effectively here. In fact, you may be surprised at how often you change lenses in the Palouse. If you have two cameras, the ideal setup is a wide-angle lens mounted on your primary camera and a long lens on the other. That way you avoid changing lenses, which reduces the risk of dust getting into your camera.

How long should I stay?

Three or four days should be sufficient to see most of the area and come away with some great images. That does not include a trip to Palouse Falls State Park, which will be a special topic later in this article. Some photo workshops last one full week, but that certainly isn’t necessary.

Best time to visit

Short answer: Spring and Summer

The undisputed best time to photograph the Palouse is late spring. Late May through June is considered prime time: the wheat fields turn a gorgeous green, the canola fields are bright yellow, wildflowers have emerged, and the weather is very nice, with temperatures typically in the 80s. If canola fields are a priority for you, I suggest late May or early June. The overwhelming majority of photographs taken in the Palouse are from this time of year. It is very popular with photographers now, so be sure to book ahead.

Another fantastic time to visit the Palouse is during the wheat harvest. Harvesting begins in mid-to-late July and extends until roughly mid-August. The fields change from their brilliant green hues to a beautiful gold/amber color, and the entire region takes on a completely different feel. The farmers are extremely busy during this period, the temperatures are hot, and the Palouse seems more dynamic and active. I think more dramatic and stark images can be captured in August. This is also the time of more frequent thunderstorms, which can result in amazing images for your portfolio.

The rest of the season is mostly ignored by photographers, although a nice snowfall during the winter months can reward you with some unique images of the Palouse.

Best locations

Let's get to the heart of this blog. Listed below are my favorite locations to photograph in the Palouse, some well known and some lesser known. Let's start with the more popular locations: the top hotspots that every photographer needs to hit.

Steptoe Butte State Park

Land Waves

Steptoe Butte is by far the most popular location in the Palouse for photographers, with good reason. The 3,612-foot butte towers above the entire Palouse region and provides a 360-degree view of the countryside below. The summit rises approximately 1,000 feet above the rolling hills and is part of a 150-acre state park recreation area. A narrow paved road winds around the butte, leading to a parking area at the summit, with several small turnouts on the way up. This makes Steptoe Butte accessible to everyone.

The views here are unobstructed and quite breathtaking. Sunset here is a must, though not so much for the orange colors that can occur at sunset; rather, the low light paints the rolling hills with a beautiful soft glow that lasts for about an hour. Since the butte is quite a distance from the hills below, I prefer a longer lens (e.g., a 300mm f/4) here to isolate sections of the rolling hills and granaries. Most of the jaw-dropping images you see from the Palouse were probably taken from Steptoe Butte, including the image titled "Field of View" below.

Back in June of 1995, I remember photographing at the summit with slide film, and I was the only person there. Nobody photographed this area back then. But the secret is out, and those days are long gone: Steptoe Butte now attracts hundreds of photographers on any given spring evening. There's probably no image you can capture from this vantage point that will be unique, but it's nonetheless a bucket-list destination for every photographer. Plan one or two evenings at this spot.

Roll With It

Dahmen Wagon Fence Farm

Dahmen Wagon Fence Farm

The Dahmen Wagon Fence Farm is an old farm, just north of Uniontown, Washington, that now houses a very small gift shop. It has a unique fence made from rusty wagon and tractor wheels that encircles the entire property; I've never seen anything else quite like it. It's definitely worth an hour or two of your time. I enjoy walking the perimeter of the fence and photographing various fence spokes with the barn included in the scene. The rolling wheat fields are present as always, so another fun image uses the wagon-wheel fence as the foreground against the fields as the backdrop.

The drive down to Uniontown is worthwhile as well. You will find more rolling hills, curvy roads, and farm scenes to photograph around Uniontown. Don’t forget to make a stop in the small town to grab an ice cream cone or two.

Heidenreich Dairy Barn & Truck

Rural America

One of the most popular photos from the Palouse is of a classic old orange truck in front of a picturesque red dairy barn; you probably know the image I'm referring to. This is the Heidenreich Dairy Barn, located on State Route 272 just north of Colfax, Washington. The owners of the property are very friendly and happy to welcome photographers, as long as they keep some distance from the home. I find this is the case with most residents of the Palouse: as long as you respect their property and show common courtesy, they have no problem sharing their local roads with the photographers who flock there every year. There is no parking at the dairy barn, so your vehicle needs to be parked alongside the highway, as far off the road as possible.

Palouse Falls

Let's talk about Palouse Falls. It deserves special attention, because the falls ARE NOT located within the general Palouse region we have discussed up to this point. Palouse Falls State Park is approximately 70 miles southwest of Colfax. If you plan to go, dedicate most of a day to this excursion, since several hours will be needed to drive back and forth from Colfax. The park lies outside the photogenic area of rolling wheat fields, and the surrounding terrain resembles a desert, with sagebrush and dry weeds. So, is it worth a day? Absolutely!

As you make your drive into Palouse Falls State Park, you will wonder where the falls are: the miles leading up to the park are relatively flat, with no water in sight. But once you arrive and take a short walk to the ledges, you will be peering down into the canyon with Palouse Falls gushing 198 feet into the river below. From a photography perspective, it's actually best mid-morning or mid-to-late afternoon; the early-morning and sunset hours cast dark shadows into the canyon and give the scene too much contrast. What you hope for are high thin clouds, in which case you can shoot all day long.

If you are into astrophotography, don't miss this opportunity. A short hike takes you up a ridge where you can set up before sunset and wait for the Milky Way to make its appearance. You will be standing precariously close to the edge, so exercise extreme caution if you decide to do this. Take note of the photographer in the scene below; that's where you want to be. A wide-angle lens (e.g., a 14-24mm) can get the waterfall and the Milky Way in the same scene. You will want to take separate exposures for the foreground and the night sky, then blend them together in Photoshop later. There aren't many places I know of where you can capture the Milky Way emerging over a waterfall like this.


I also want to point out a few lesser-known spots that are fantastic. These locations have no crowds or hordes of photographers nearby. The best thing about the Palouse is that you can escape to any area of the region and find somewhere you have all to yourself. I love those moments.

Uniontown Red Barn

Red Barn

Just south of Uniontown lies one of my favorite barns in the entire Palouse area: a large Dutch gambrel-style barn sitting in the middle of a wheat field on a slowly rising hill, with big skies behind it. What makes this image so striking are the kaleidoscopic colors: bright red paint, green wheat fields, yellow canola fields, and, hopefully, a bright blue sky as a backdrop. It's almost like a painting.

This barn is easily photographed from the road. The sign on the front of the barn adds a little extra touch. There are barns similar to this all over the Palouse, but I have to say this is probably my favorite.

Moscow to Troy Backroads

The vast majority of photographers stay on the Washington state side. But I’ve found some hidden jewels just east of Moscow, Idaho along Highway 8 all the way to Troy. You can find canola fields, old barns, and roads that lead to elevated hillsides with nice vistas.

Pullman to Palouse Backroads

I strongly advocate getting lost on the backroads; it is probably the best way to experience and photograph the Palouse, and I guarantee you will find some incredible images with this not-so-precise strategy. A great plan for the day is to get a full tank of gas, head north from Pullman, and turn down any dirt road you can find. Get as lost as you can be. So where to start? My favorite backroads are north of Pullman, all the way up to the town of Palouse; there are so many gravel and dirt roads to explore. The image titled "Sunbeam" was captured in this area during an approaching storm. I wish I could tell you exactly where I was, but I honestly don't know.


Is there a photographer's map of the Palouse? Yes. It was created by Teri Lou Dantzler and is simply the most detailed photography map of the Palouse around; I've included a link to her website below. Teri Lou has assembled over 150 specific locations on the map, and the $25 price will save you a lot of time and effort. Unfortunately, the Pullman Chamber of Commerce has "borrowed" many of Teri's maps and now gives them away for free. I hope people can appreciate how much work it took to assemble a photographer's map like this; we should all reward Teri for her amazing work by purchasing her map.

Map of the Palouse

The Long Road to Ignition: An Eyewitness Account

Contents:
I. Introduction
II. Deep Roots: The Early 1950s
III. The Early Days of the LLNL ICF/Laser Program (A. The Very Early Days; B. The Nature Paper; C. The Switch to Indirect Drive)
IV. The Lessons Learned from Those Early Lasers (A. Hot Electrons; B. The Birth of High Energy Density Physics (HEDP); C. A Color Change; D. A Green Light and X-Ray Lasers; E. The Nova Laser)
V. Getting NIF Approved and Preparing for Its Completion (A. Getting NIF Approval; B. NIF Construction; C. Preparing for NIF Experiments; D. Red Teaming)
VI. Surprises upon the Startup of the Full NIF Laser
VII. Changes to the Point Design Followed by Steady Improvements in Performance (A. Overview; B. High Foot; C. Hohlraums with Low Gas Fill; D. High Density Carbon (HDC) Ablators)
VIII. Changing Scale, Conquering Symmetry, Pushing Longer, and Achieving Ignition (A. Larger Scale; B. Symmetry; C. Pushing Longer; D. Achieving Ignition)
IX. The Future
X. Lessons Learned

Mordecai D. Rosen; The long road to ignition: An eyewitness account. Phys. Plasmas 1 September 2024; 31 (9): 090501. https://doi.org/10.1063/5.0221005

This paper reviews the many twists and turns in the long journey that culminated in ignition in late 2022 using the laser heated indirect-drive approach to imploding DT filled targets at the National Ignition Facility (NIF), located at the Lawrence Livermore National Laboratory (LLNL). We describe the early origins of the Laser Program at LLNL and key developments such as the paradigm shifting birth of high energy density physics (HEDP) studies with lasers, changes in choice of laser wavelength, and the development of key diagnostics and computer codes. Fulfilling the requirements of the multi-faceted Nova Technical Contract was a necessary condition for the approval of the NIF, but more importantly, the end of the Cold War and the cessation of nuclear testing were key catalysts in that approval, along with the ready-and-waiting field of HEDP. The inherent flexibility of the field of laser driven inertial confinement fusion played a fundamental role in achieving success at the NIF. We describe how the ultimately successful ignition target design evolved from the original “point design” target, through the lessons of experiment. All key aspects of that original design changed: The capsule's materials and size were changed; the hohlraum's materials, size, laser entrance hole size, and gas fills were also all changed, as were the laser pulse shapes that go along with all those changes. The philosophy to globally optimize performance for stability (by raising the adiabat and thus lowering the implosion convergence) was also key, as was progress in target fabrication, and in increasing NIF's energy output. The persistence of the research staff and the steadfast backing of our supporters were also necessary elements in this success. 
We gratefully acknowledge seven decades of researcher endeavors and four decades of the dedicated efforts of many hundreds of personnel across the globe who have participated in NIF construction, operation, target fabrication, diagnostic, and theoretical advances that have culminated in ignition.

On December 5, 2022, researchers used 2.05 MJ of laser energy from the National Ignition Facility (NIF), located at the Lawrence Livermore National Laboratory (LLNL), and aimed it into a "hohlraum": a cylinder of high-Z material whose walls were made from depleted uranium (DU) lined with a thin layer of gold. At the center of that hohlraum was the capsule: a 2-mm diameter shell of high-density carbon (HDC) of thickness ∼170 μm. Inside that shell was a concentric shell: a 75-μm-thick layer of frozen DT. The laser pulse was directed at the walls of the hohlraum, not at that central capsule. In this "indirect drive" approach, the laser was absorbed on the walls, and electron conduction deeper into the walls heated an over-critical plasma to a temperature sufficient to be the source of x rays that were then absorbed and reemitted by the walls of the hohlraum, creating a bath of x rays that ablated the outer surface of the capsule. In a rocket-like reaction to that ablation, the x rays generated by a temporally shaped laser pulse launched a sequence of shocks into the imploding capsule that kept the DT fuel on an adiabat reasonably close to that of a Fermi-degenerate plasma. Figure 1 (whose main focus is the difference between the two shots) should give the reader a sense of the hohlraum, the capsule, and the laser and radiation drive's temporal shape that were involved in this achievement.

Schematic of the hohlraum and capsule of the ignition target. The laser pulse and hohlraum drive temperature contrast shot N210808 (black) with shot N221204 (red). Reproduced from H. Abu-Shawareb et al., Phys. Rev. Lett. 132(6), 065102 (2024). Copyright 2024, American Physical Society.

When this ablatively driven rocket implosion stagnated upon itself, a central hotspot was formed. The imploding shell performed compressional heating (like a piston in a car engine), in which pressure is applied to a decreasing volume ("PdV" work) of the hotspot, raising its temperature to ∼5 keV. The size and density of the hotspot (characterized by the product of density and radius, the "hotspot ρR") were sufficiently large to trap/confine the alphas produced by the DT reaction so that they could further heat the hotspot. The dense shell stagnating on this hotspot was of sufficient inertial "quality" (characterized by the "total ρR") to confine this already hot assembly for a sufficiently long time that the thermonuclear heating wave could propagate into the dense fuel. This process resulted in even more fusion and more heating, raising the assembly to ∼10 keV and propagating the burn further into the dense surrounding shell of DT. This process, where "Mother Nature" takes over from human efforts (namely, the PdV work of the implosion) and the temperature "runs away" due to this thermal instability (more fusion, via alpha deposition, heats the plasma, which results in even more fusion …), is, in essence, what ignition really is. At NIF scales, this process will inevitably lead to fusion yields in excess of 1 MJ. On the December 5th shot, as this high-pressure system exploded and disassembled, a total of 3.15 MJ of fusion energy was produced. 1–5  
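As a rough guide (a standard textbook estimate, not a number taken from this paper), the alpha-trapping condition above becomes concrete when the hotspot's areal density is comparable to the stopping range of a 3.5 MeV alpha in hot DT:

```latex
\rho R_{\mathrm{hs}} \;\gtrsim\; 0.3~\mathrm{g/cm^{2}}
\qquad \text{at} \qquad T_{\mathrm{hs}} \sim 5\text{--}10~\mathrm{keV}.
```

Above this threshold the alphas deposit most of their 3.5 MeV in the hotspot rather than escaping, feeding the thermal run-away described above.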

This first-ever achievement of more fusion energy produced than incident laser energy entering the target "officially" met the definition of ignition put forward in a 1997 report by a committee formed by the National Academy of Sciences (NAS). 6 This operational definition (Gain > 1) was chosen to avoid the controversies and lack of consensus over the definition as it existed at that time. The Department of Energy (DOE) and the specific section that funds this research, the National Nuclear Security Administration (NNSA), adopted this criterion. The tale of achieving a goal that the inertial confinement fusion (ICF) community of researchers had sought for over 50 years is a fascinating journey of ups and downs and of the amazing perseverance of its adherents and practitioners. It is only now, after ignition has been achieved, that the Inertial Fusion Energy (IFE) community has the full impetus to pursue high-gain target strategies, high-efficiency and high-rep-rate lasers, and all of the other components necessary to fill out the IFE portfolio.
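Stated as arithmetic, using the energies quoted in the opening paragraph, the NAS Gain > 1 criterion was met with margin on the December 5 shot:

```latex
G \;=\; \frac{E_{\mathrm{fusion}}}{E_{\mathrm{laser}}}
  \;=\; \frac{3.15~\mathrm{MJ}}{2.05~\mathrm{MJ}}
  \;\approx\; 1.5 \;>\; 1 .
```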

This paper will review the key steps along the way that led to this historic achievement of ignition. It will be told in the first-person voice, as I have been an eyewitness and participant in this journey for nearly 50 years, since the mid-1970s. Throughout this exposition, I will supplement the technical material with personal anecdotes that relate to the events and personalities involved in this long story. This paper will not be a detailed description of every avenue that was pursued, but will focus on those steps that pushed the process along.

In Sec. II , I will explore the deep roots of the field of ICF from the early 1950s and show the commonality of its roots with the magnetic fusion energy (MFE) efforts. In Sec. III , I discuss the work of John Nuckolls and colleagues at LLNL in formulating the ICF problem/challenge in the late 1950s, pre-dating the invention of the laser. I will also discuss the early laser work at LLNL. In Sec. IV , I discuss important strategic changes in direction in response to hard earned lessons learned from these earlier laser systems. One outgrowth of these changes was the invention of the field of high energy density physics (HEDP). I also discuss the motivations that led to the proposal to build the NIF and the specifics therein. In Sec. V , I discuss the efforts to get the NIF approved, in the 90s, and the issues involved in its actual construction. In addition, I discuss important work done in the 2000s to learn new physics and to prepare crucial new experimental platforms in preparation for the NIF completion and the onset of the National Ignition Campaign (NIC), a formal program that was to expire in 2012.

In Sec. VI , I describe a series of surprises and challenges that were uncovered as the NIF started its ignition attempts. Some solutions and eventual explanations for these surprising results will also be discussed. In Sec. VII , I present how the program utilized, what I like to call, “ICF's superpower,” namely, its ability to change, adapt, and innovate due to its inherent flexibility. These changes in direction led to steady improvements in target performance. In Sec. VIII , I describe how these efforts culminated in ignition. Section IX briefly covers future directions, and Sec. X looks back at all of this and suggests “lessons learned.”

I choose to begin this recounting of the history of ICF ignition by looking at events in the very early 1950s. (Coincidentally, that is when my personal history has its early start, as I was born in 1951.) In reaction to the Soviet Union's 1949 demonstration of an atom bomb, U.S. President Truman gave the go-ahead to develop a more powerful hydrogen bomb. At Los Alamos, New Mexico, the single U.S. nuclear weapons design laboratory of that time (now called Los Alamos National Laboratory [LANL]), no one had a sure-fire way of designing one. The great physicist John Archibald Wheeler, who with Niels Bohr had earlier made important contributions to understanding which isotopes of uranium fission, wished to contribute to the H-bomb effort as well. He received permission from Los Alamos to set up a "second lab" at his home institution, Princeton University, in Princeton, New Jersey, in order to do so.

A recent paper by Chadwick et al. 7 delves even deeper into the roots of DT fusion and traces it back to work in the late 30s and early 40s. In the late 30s, A. J. Ruhlig from the University of Michigan published work on DD reactions. Since tritium is produced in half of those reactions, a secondary DT reaction can ensue. The observation of high energy neutrons implied a very high cross section for the DT reaction. The paper argues that Emil Konopinski was aware of this result and brought it up in the initial discussions in the Manhattan Project that considered a hydrogen/fusion bomb. The paper then traces the creation of some tritium at Berkeley by Emilio Segre and colleagues at the specific request of J. Robert Oppenheimer and Hans Bethe. That tritium was sent to Purdue University where the very first cross sections of the DT reaction were measured in 1943. They showed the reactivity was ∼100× that of DD. This result gave great impetus to the H-bomb effort.

So, back to the 1950s. Wheeler sent Lyman Spitzer to Los Alamos to collect data upon which the nascent effort at Princeton would base its work. In those days, cross-country travel was by train, and Spitzer got off the train in Aspen, Colorado, to indulge in his favorite hobby, snow skiing. While on the chair lift, he noticed how the cable wires twisted and was seized by a "eureka" moment concerning how to confine particles in a toroidal geometry that were drifting upward due to a grad-B drift: twist the B field lines so that the particles' drift's "starting point" would alternate between the top of the torus and the bottom. From the bottom of the torus, they would drift "up" into the center of the plasma and thus be confined. This was the idea behind the stellarator. So enamored was Spitzer with this idea that he got right back on the train and returned to Princeton, having never made it to Los Alamos.

As a result, Wheeler's effort at Princeton bifurcated. Matterhorn-S became Spitzer's stellarator project, which evolved into the present-day Princeton Plasma Physics Laboratory (PPPL). When I was a graduate student at PPPL in the early 70s, preprints were still being numbered as "MATT-nnnn." The other half of Wheeler's effort, Matterhorn-B, remained the original project to help with Los Alamos' H-bomb developments. These developments are described by an active participant in this latter effort, Ken Ford, a graduate student of Wheeler, who also helped Wheeler write his autobiography Geons, Black Holes, and Quantum Foam: A Life in Physics. 8 Ford also wrote a much shorter version for Physics Today. 9  

The Matterhorn-B project was relatively short lived. During that short time, it was evaluating a more difficult route to a fusion device. The Teller–Ulam scheme was invented at Los Alamos in which a “primary,” namely, a fission driven atom bomb, emits a robust radiative output, which would be trapped in a large hohlraum, and drive a “secondary,” a separate hydrogen fusion bomb. This concept was successfully tested and thereby obviated the immediate need for a second lab.

There has been much said about the Teller–Ulam invention. Given that it has been declassified, the carefully worded (pre-declassification) published descriptions 10 of it can be clearly understood now. Ulam had the idea of separate devices, and Teller adapted and improved upon it, specifically with regard to radiation as the instrument of energy transfer. I recall, in the early 1990s, being in Edward Teller's office at LLNL. While I do not recall the specific reason for that particular visit, I used the opportunity to ask Dr. Teller his opinion about another controversy: the question of whether Heisenberg had deliberately slowed down the Nazi effort to produce an atom (fission) bomb, or whether he had simply "missed the (technical) boat" and not understood that a compact device could achieve what it eventually did in the Manhattan Project. In retrospect, it was no surprise that Teller vigorously defended Heisenberg and insisted that he had purposely sabotaged the Nazi efforts. It was no surprise because Heisenberg had been Teller's thesis advisor, and, as such, there was a special bond between the two of them.

While we were speaking on this subject, we were interrupted by a phone call. It was a reporter (from Time magazine, I believe) who wanted to hear, first hand, from Teller as to who should take priority on the Teller–Ulam invention. Teller replied as to how he, Teller, had really nurtured the idea from concept into reality. In that sense, some might say that Ulam was the "Father of the H-bomb" and Teller was the "Mother of the H-bomb." Since the iron curtain had recently fallen, Teller added the following remark to wrap up the interview. Alluding to the fact that Stan Ulam was a Jewish refugee from the Nazis from Poland, that Teller was a Jewish refugee from the Nazis from Hungary, and that both countries had, until recently, been behind the iron curtain with curtailed freedoms, Teller remarked that the two countries were now "free" to go to war with each other over who should get credit for the invention!

Two prominent designers of the first successful test of the Teller–Ulam invention were both graduate students at the University of Chicago, who worked at Los Alamos in the summers: Richard Garwin and Marshall Rosenbluth, both now recognized as giants in their fields. The second reason for the short duration of Matterhorn-B was that Edward Teller and E. O. Lawrence wished to pursue further ideas on the subject, so they started a rival weapon design lab at Livermore, California, which is now LLNL. For quite a long time, the Lawrence Berkeley Lab (where Lawrence built his cyclotrons) was known as site 100, while Livermore was called site 200. An explosive testing facility even further to the east, near Tracy, CA, was established, known to this day as site 300. Some of Wheeler's crew went West to join that Livermore effort, thereby diluting the Princeton effort and accelerating its demise.

In my career, I was privileged to interact with both Dick Garwin and Marshall Rosenbluth.

I met Marshall rather early in my career, as he was one of the four (now late) professors teaching the first-year graduate course in plasma physics at Princeton, the others being Harold Furth, Paul Rutherford, and John Dawson. Marshall's thesis advisor at Chicago was Edward Teller, and his long list of contributions to the field throughout his illustrious career earned him the sobriquet of "the pope of plasma physics." Once, during my graduate years, Marshall gave a seminar at PPPL's theory wing and "temporarily" put his pipe into his jacket pocket while he spoke. Probably none of the attendees can remember the specific topic, since we were all mesmerized by the column of smoke rising from his pocket throughout his lecture.

In the 1990s, Marshall was part of the advisory group reviewing our ICF research progress in the context of giving a go-ahead for the NIF (all of which will be described later in this report). When I described the state of the laser-heated gold walls of the hohlraum (whose low density blowoff reached a temperature of several kiloelectron volts) as being stripped of 51 of its 79 electrons, leaving 28, thus creating a "nickel-like" ionic state, Marshall remarked: "Well, that's your problem right there, Mordy; you're trying to turn gold into nickel, when you should be trying, like the ancient alchemists, to turn nickel into gold!" In any event, Marshall was a great friend of ICF, and we owe him much gratitude.

I met Dick Garwin much later in life, though his reputation preceded him. His thesis advisor at Chicago was Enrico Fermi, who is reported to have said that Dick was the only true genius that he had ever met. 11 This is extraordinarily high praise from Fermi, who certainly interacted with John von Neumann throughout the Manhattan Project! Dick, operating from a base of being an IBM fellow, has had a distinguished career of advising the U.S. government on a variety of breakthrough technologies. Like Marshall, Dick was a constant fixture at the JASON group, which advises the U.S. government on technical issues. It was composed of academics who could spare the time in the summers, hence the name JASON: July, August, September, October, November. The group was founded by John Wheeler, who had wished to form a permanent group as an outgrowth of Matterhorn-B, but settled for a summer study group. At the JASON meetings, Dick had a legendary modus operandi, in which he would quickly rifle through the presenters' view-graphs (back in the day when they were plastic foils, not PowerPoint slides) before the talk was to be given, and that would suffice; it was rare that he would feel the need to sit through the presentation. Imagine my surprise then, in the mid-2000s, when I was briefing the JASON group on my contribution to understanding energy balance in nuclear events, that Dick sat through the entirety of my presentation. He asked probing questions, all of which I had been lucky enough to anticipate and prepare answers for. Freeman Dyson, another JASON member, later remarked to me that it was the most interesting piece of physics in the nuclear weapons realm that he had heard in over 40 years. The next day, Dick pulled me into a side room and, on the white board, sketched a completely independent way of thinking about the problem and rederived my results. Indeed, that is a good description of his genius.

In the early to mid-2000s, both LANL and LLNL were still jointly run by the University of California (UC). As such, oversight committees appointed by UC evaluated the quality of science being done at each lab. I was appointed chair of such a committee evaluating the physics department of LANL. That department comprises an impressively diverse set of activities, including ICF (my connection to this enterprise), as well as high energy particle physics, nuclear physics, and bio-physics. The oversight committee was truly an all-star team from these many fields, and it included the late Stuart Freedman, the experimental nuclear physicist from the Lawrence Berkeley Lab, whose experiment with John Clauser on Bell's inequality led to the latter's Nobel Prize in 2022, and Andrew Viterbi, the visionary technologist who co-founded the cellular giant, Qualcomm Inc. Also on the committee was Dick Garwin. Dick spent most of his time, it seemed, busy answering emails during all of the presentations. However, I will never forget the time a LANL researcher was presenting his work on brain wave imaging using a wired helmet, for review by our committee, and mentioned that he was having difficulties getting a good signal. Dick looked up from his emails and asked the presenter to project his circuit diagram. Immediately, Dick pointed to a place on the diagram where, he said, the researcher needed to add a pre-amp, and then went back to his emails. Of course, Dick proved to be correct!

Despite its relatively short duration of operation, the Matterhorn-B effort did have a lasting impact. One construct from that effort that has lasted to this day is the so-called "Wheeler Diagram." Its purpose is to delineate a path in parameter space on which the dT/dt of the system stays positive, allowing the thermal instability we call ignition to occur. Early examples of this, applied to ICF, appear in Kirkpatrick and Wheeler, 12 where the y-axis was T and the x-axis was density. Later, M. Widner (Ref. 11 of our Ref. 12) changed the x-axis to the density-radius product ρR. In Fig. 2(a), we show such diagrams as they appear in John Lindl's review article 13 some 45 years later, as well as, in Fig. 2(b), in a more recent article by Annie Kritcher and co-workers 148 showing the system trajectories of non-igniting to nearly or fully igniting targets.

“Wheeler Diagrams” showing trajectories in a phase space of fusing systems, with axes T and rho-R. (a) Reproduced from J. Lindl, Phys. Plasmas 2(11), 3933 (1995) with the permission of AIP Publishing. (b) Reproduced from A. Kritcher et al., Phys. Rev. E 106, 025201 (2022), an open access publication of the American Physical Society.


In that same review article, Lindl presents my results for an analytic solution to a non-linear differential equation that traces a path through the Wheeler Diagram space of T vs ρR, in the regime early in the implosion when PdV heating dominates and electron conduction is the dominant cooling mechanism. This T ∼ (ρR v_imp)^(2/5) trajectory is a "stable attractor" in this phase space. Less than a week after I came up with this solution (and I did not particularly spread the news of this result around), I was quite surprised to get a phone call from Dr. Teller's office. He had to respond to an inquiry from DoE headquarters on the distinction between ICF ignition and the workings of a nuclear weapon. I went to his office and showed him this result, which he subsequently used in his successful defense of keeping ICF an open avenue of inquiry and research. How Teller knew that I had just come up with this formula, I'll never know.

In Fig. 3, we show a "class photo" from Matterhorn-B, as shown in Ref. 9. One notes the "climber's rope" on the left in row 2, a homage to scaling the Matterhorn. Wheeler and Ford are on the right of row 2. Some other prominent alumni are in the top row. On the left is David Layzer, an astrophysicist at Harvard who did work relevant to ICF regarding the Rayleigh–Taylor instability (RTI). Fourth from the left is Edward Frieman. When I was a graduate student at PPPL in the early to mid-70s, Ed was my second-year advisor and later a reader of my PhD thesis, and he served as PPPL's deputy director. He later went on to head the Office of Science at DOE, and after that was the Director of the Scripps Institution of Oceanography in San Diego. His contributions to national security include his work on numerous very high-level advisory committees on technical projects, arms control, and climate impact, including a leadership role with the JASON group.

The staff of Matterhorn B, ca. 1952. Photo appeared in “John Wheeler's work on particles, nuclei, and weapons” by Kenneth Ford. For further annotation, see text. Reproduced from K. Ford, Phys. Today 62(4), 29–33 (2009) with the permission of AIP Publishing.


In that same top row, second from the right, is an individual only labeled as “unidentified,” as his head is partially obscured by Wheeler's. I can identify that person as Carl Haussman. Shortly after that photo was taken, Carl was one of those who left the project and joined up at the newly established Livermore lab.

Carl went on to have a distinguished career at LLNL, and he retired from there as a deputy director at large. One of Carl's most impactful contributions was to lead a group of designers who came up with a fundamental breakthrough in weapons design. This was one of two breakthroughs (the other by Johnny Foster) that allowed the lab to design a working system to fit on the Polaris submarine-launched missile, thereby ensuring deterrence for many decades. When Edward Teller first announced that LLNL could accomplish this (at the time) seemingly impossible goal, there was great skepticism that it could be done. Those two basic contributions have impacted all systems in the current U.S. stockpile. The success of the Polaris project firmly established the reputation of LLNL. To some degree, this history of achieving ignition on the NIF, despite its many doubters and skeptics, has analogies to the Polaris story. In later years, Carl paid attention to the environmental state of LLNL's physical layout, and, to this day, at the center of LLNL is a lake, surrounded by a natural eco-system, named after him.

How can I be sure that Carl is the partially obscured individual? Well, in 1995, I gave a Physics Colloquium at the Princeton University Physics Department in Jadwin Hall, speaking about our plans for the not-yet-built NIF. Before the seminar, I met with John Wheeler, then in his mid-80s, who asked after Carl's welfare. He remarked that I should give his regards to Carl, adding that "Carl was a good boy." By chance, a week later at LLNL, I ran into Carl and passed along the regards. He, near his retirement, was rather amused at being called a "boy"! A few years later, Carl passed away, preceding Wheeler in death by a decade.

The “tradition” started by Carl, in the early 50s, of migrating from PPPL to LLNL was greatly reinforced in the early 70s when the laser fusion effort at LLNL was just getting off the ground.

1960s: S. Bodner, D. Forslund, B. Langdon, W. Kruer

1970s: C. Max, J. Lindl, E. Williams, M. Rosen, M. True

1980s: R. Chrien, C. Barnes, D. Ho, D. Meyerhofer, T. Murphy, C. Keane, P. Beiersdorfer

1990s: D. Ward, D. Roberts, M. Herrmann, H. Herrmann, R. Heeter

2000s: M. Karasik, S. Hsu, Y. Ping, D. Clark, A. Sefkow, L. Berzak Hopkins

2010s: J. Kallman, L. Petersen, J. Baumgaertel, P. Schmit, J. Mitrani, S. Davidovits, Y. Shi

This list is not meant to exclude many other Princeton University graduates from other departments who have also made large contributions to the ICF program, nor, of course, is it meant to exclude the many other outstanding institutions of higher learning which have contributed talented and dedicated staff from which the ICF Program has benefited enormously.

In the late 1950s, John Nuckolls of LLNL considered how to apply weapon research to civilian applications. Starting from the above-mentioned Teller–Ulam scheme, clearly two major changes needed to be made to make that happen. First, the fission primary needed to be eliminated altogether, and be replaced by a non-nuclear “driver” of some kind that would heat the hohlraum. Second, the hydrogen/fusion secondary would have to be considerably reduced in size to allow for containable, sustainable fusion outputs, suitable for a civilian power plant.

Nuckolls 14 envisioned, for example, a megajoule-class particle beam entering a ∼1 cm size hohlraum and imploding a radiation-driven, several-millimeter-size DT capsule. These choices were remarkably prescient, given that these are precisely the scales of driver, hohlraum, and capsule, as discussed in the introduction, that achieved ignition on the NIF some 65 years later. Moreover, that vision predated the invention of the laser by about half a decade.

As will be described below, I was hired by John Nuckolls and worked for him for many decades. In my life as a physicist, I have encountered many extraordinary and brilliant colleagues. However, I define "genius" as someone who thinks about things in completely divergent ways. I have encountered two: Dick Garwin and John Nuckolls. John's actual achievements, as well as some of the mind-blowing ideas that I have heard from him (almost all classified), qualify John for that description. I recall one long week of interacting and collaborating with John on a specific piece of ICF physics that caused my head to ache. Every day, John would explain how he got to a certain result in a surprising and unique way, and I would spend the rest of the day trying to reconstruct that result using "normal" physics thinking. A week of such struggles was all my brain could take.

Once the laser was invented, it was clear that research should be pursued that considered it as the driver of choice for ICF. Nascent efforts at laser building at LLNL began, under the tutelage of such pioneers as Ray Kidder, Sterling Colgate, and Yu-li Pan. As the efforts became more serious, Director Mike May created a laser directorate and, as mentioned above, named Carl Haussman as LLNL's first associate director for lasers.

One of Carl's lasting contributions to the development of the field was to go out and hire John Emmett from the Naval Research Lab (NRL). John was a powerhouse of a laser builder, and he created a world-class team of laser expertise around him, including Bill Krupke, John Trenholme, John Murray, John Holzrichter, John Hunt, Jeff Paisner, Abe Szoke, Julius Goldhar, Paul Wegner, and Jack Campbell. The team of Nuckolls and Emmett led a two-decade effort of unprecedented progress and growth in building ICF strategies and target designs along with their concomitant lasers.

In 1972, the field of ICF emerged from the shadows of the nuclear weapon design world with the publication 15 of the Nature paper by Nuckolls, Wood, Thiessen, and Zimmerman. The key concept revealed in that paper was the necessity of pulse shaping the drive so that the DT fuel would stay as cold as possible, even as the pressure on it built up. This would help the fuel get as dense as possible, thus minimizing the required driver energy. To wit, it meant that the fuel would follow, as closely as possible, the Fermi-degenerate isentrope, P_FD = kρ^(5/3), where ρ is the density of the fuel and P_FD is its minimum ("quantum") pressure. We define α = P/P_FD, where P is the actual pressure of the fuel. Thus, α = 1 represents the best one can do, and a value greater than unity represents a surrender to practicalities that might sometimes be required. In this story of the road to ignition, we will see many important instances of such compromises along the way.
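For concreteness, the isentrope and the adiabat parameter α can be written out as follows. The numerical coefficient for DT is a standard value from the ICF literature (e.g., Lindl's review), quoted here only for illustration and not taken from this paper:

```latex
P_{\mathrm{FD}} \;=\; k\,\rho^{5/3} \;\approx\; 2.2\,\rho^{5/3}\ \mathrm{Mbar}
\quad (\rho\ \text{in g/cm}^3),
\qquad
\alpha \;\equiv\; \frac{P}{P_{\mathrm{FD}}} \;\ge\; 1 .
```

A shock or preheat that raises P above the degenerate minimum raises α, and the fuel then requires more energy to compress to a given density.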

The reason for the need for very high fuel densities is (at least) twofold:

First: To ignite the hotspot in the implosion, we need the fuel to have a requisite ion temperature, T_i, of order 5 keV (and for a dense enough hotspot, T_e and T_i are nearly equal) to get the fusion rate going robustly, and a density-radius product ρR of at least 0.3 g/cm² in order to stop the alphas produced by those reactions, so that they may further heat the hotspot fuel to ignition. Thus, the energy of the hotspot scales as E_HS ∼ M_HS T ∼ ρR³T ∼ (ρR)³T/ρ². The numerator, as just explained, must be at least a fixed amount, (0.3)³ × 5. Thus, the leverage lies in making the denominator, ρ², as large as possible to minimize the energy needed to ignite the hotspot. Minimizing that energy in turn minimizes the size and energy of the driver, and thus its cost.
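The scaling argument above can be written compactly (a sketch in the paper's own symbols; the numerical thresholds are the ones quoted in the text):

```latex
E_{\mathrm{HS}} \;\sim\; M_{\mathrm{HS}}\,T \;\sim\; \rho R^{3}\,T
  \;=\; \frac{(\rho R)^{3}\,T}{\rho^{2}}
  \;\gtrsim\; \frac{(0.3\ \mathrm{g/cm^{2}})^{3}\,(5\ \mathrm{keV})}{\rho^{2}} ,
```

so at fixed ignition thresholds the required hotspot energy falls as 1/ρ²: doubling the fuel density cuts the minimum hotspot energy by a factor of four.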

Second: The basic confinement of ICF is inertial: the hot and burning imploded core will disassemble on a timescale of its final radius, R, divided by a sound speed. 16 To maximize yield, the fusion rate (proportional to ρ) must be fast compared to that timescale. In short, we seek to maximize the total ρR of the system to achieve a good burn-up of the fusion fuel and a high efficiency output. At a fixed target mass, M, a compressed sphere has R ∼ (M/ρ)^(1/3), so ρR ∼ M^(1/3) ρ^(2/3); thus, again, a large ρ is needed for good ICF performance.
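The fixed-mass scaling quoted above follows in one line (symbols as in the text):

```latex
R \;\sim\; \left(\frac{M}{\rho}\right)^{1/3}
\;\Longrightarrow\;
\rho R \;\sim\; M^{1/3}\,\rho^{2/3} ,
```

so at fixed mass, a tenfold increase in density buys roughly a 4.6-fold increase in ρR. (As a commonly used rule of thumb from the broader ICF literature, not from this paper, the burn-up fraction is often estimated as f_b ≈ ρR/(ρR + 6 g/cm²).)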

The Nature paper invoked the results of the two-dimensional (2D) hydrodynamics code Lasnex. My colleague (to this day), George Zimmerman, was the creator of this code. In preparation for that paper, many simulations were run to optimize the target performance. The printout from each simulation would also report the amount of time the problem took to complete. One early designer mistook that "timing" printout for a report of the yield, so he optimized on that rather than on yield, until George set him straight. I consider George a rare "National Treasure." His ability to integrate diverse fields of physics and to understand each field so deeply and fundamentally is off the charts. I measure my growth as a physicist by the degree to which I increasingly come to appreciate George's greatness.

The Nature paper assumed a simple “bare drop” of DT illuminated directly by the laser. This would clearly have simplicity on its side, a trait one would like to see in a power reactor. With much more refined analysis, it became clear that Mother Nature would not be so kind, on at least two counts.

First, the very high intensity driving this design at its peak was of order 10^17 W/cm² and was thus determined later to be subject to laser plasma instabilities (LPI). 17,18 These would compromise the coupling to the target, as well as be the source of hot electrons that would penetrate deep into the target and preheat the fuel. Such preheat would negate the whole idea of the pulse shaping, which was to keep the DT fuel as cold as possible. Some of the early researchers involved in these insights were Bill Kruer, John DeGroot, and Jonathan Katz. Specific concerns about the non-linear nature of these LPI effects were published by the LLNL group, 19 as well as by the exceptionally talented LPI group at LANL. 20

Second, ablation-driven rockets, also known as ICF target implosions, have the low density ablated material accelerate the dense shell. Thus, there is an effective "gravity," g, pointing from the dense shell into the lower density gas. This is equivalent to an inverted glass of water. In principle, air pressure should keep the water in the glass. However, that pressure equilibrium is an unstable one, because the system can lower its energy in the gravity field by exchanging the dense upper fluid (water) with the low density lower fluid (air). It is a classic example of the Rayleigh–Taylor instability (RTI). This instability can be mitigated somewhat by the ablation process itself. The ablative stabilization formula used in the Nature paper, attributed to a "work-in-progress" talk by Chuck Leith, was determined later to be inaccurate. 13,21 Given the technology at that time, direct drive laser intensity profiles were quite non-uniform, which would provide deadly initial perturbations that would grow due to the RTI and eventually completely break up the imploding shell and ruin the implosion.

We will see throughout this paper that this one-two punch of LPI and hydrodynamic instabilities forms the "Scylla and Charybdis" through which the good ship ICF must safely sail to attain success. These two constraints have persisted throughout this long journey to ignition. In the early days, then, the decision was made to shift the program from direct drive to indirect drive, (again) for at least two reasons. First, the capsule inside the hohlraum is driven by the x rays. It did not matter (too much) that the laser was non-uniform as it hit the walls of the hohlraum. Two adjacent points on the capsule would look out at their "sky" and see a "mess," but, crucially, if they were sufficiently close to one another, they would see the same mess, and thus they would be driven equally. Second, x rays reach deeper into the capsule and thus ablatively stabilize the RTI much better than direct drive. 16

The LLNL laser/ICF program was building lasers throughout the period when this reassessment of the target/drive choices was made. In 1975, the Cyclops laser was a one-beam prototype of what was to be the 20-beam Shiva laser, conceived as a direct drive implosion facility. This was the first example of many, on this journey, wherein a multi-beam "mega facility" would, quite sensibly, have its technology tested on a single prototype beam. Also in that year, the Janus facility (like its two-faced mythological namesake) had two beamlines and produced the first thermonuclear burn products using DT, in an indirect drive geometry. Many details about these early days of indirect drive at LLNL can be found in John Lindl's review article, 13 while a comprehensive review of early direct drive efforts around the world can be found in the review article by Steve Craxton and colleagues from the University of Rochester Laboratory for Laser Energetics (URLLE). 22

I arrived at LLNL in 1976. Prior to that, I was pursuing my graduate degree at PPPL. I had outstanding teachers such as Marshall Rosenbluth, John Dawson, Harold Furth, Tom Stix, Carl Oberman, Rip Perkins, Ed Frieman, and Miklos Porkolab. When Ed Frieman would lecture, he would nimbly and consistently interchange a long white cigarette in one hand and a long white piece of chalk in the other, and amazingly never confused the two. I had gracious and generous advisors in Ed Frieman and John Greene. Most of all, I had extremely impressive fellow graduate students. Rob Goldston went on to be PPPL director. Ned Sauthoff eventually became the head of the U.S. part of the international tokamak project, ITER. Steve Jardin has had a stunning career in computational plasma physics at PPPL. Earl Marmar and Adil Hassam went on to distinguished careers at MIT and the University of Maryland, respectively. My second-year research project found a "vertical" (namely, "n = 0," or axisymmetric) "instability" in tokamaks. For any poloidal shape, there was a perturbation (some combination of "m" modes) that would grow. My classmates termed it "the Rosen wrinkle." 23

As satisfied as I was with my theory work, I felt that, as a theorist, I would have minimal impact on the promising future that new tokamaks would lead to. I feared that I might go an entire career and not change a single screw on a proposed tokamak. After graduating, I interviewed at the LLNL laser fusion program. Earlier, the LLNL ICF Program's Claire Max had come by PPPL to recruit. I was excited by the possibility that as a target designer, I could put “a new tokamak” in front of the high-tech and expensive element of the ICF Program, namely, the laser, every single day. This insight has served me well throughout my career and has kept my work fresh, diverse, and exciting for nearly 50 years. Throughout this description of the path to ignition, I will invoke this notion of “ICF's superpower.” By that I mean the possibility and flexibility of innovating, and the ability to react to experimental challenges and disappointments with a change of target and approach.

By the time I arrived at LLNL in 1976, the Argus facility was firing shots. It, too, was two-beam and started using what were then modern and novel technologies, such as a pinhole component to smooth out laser beam intensity profile irregularities. To azimuthally symmetrize the indirect drive on targets illuminated by Argus, the target used a scattering cone. In principle, this would have worked just fine. In practice, however, high irradiance on that cone led to LPI. I recall losses from Brillouin scattering approaching 50%. Moreover, the scattering would depend on the laser polarization, and that, combined with such LPI issues as side scattering, broke the azimuthal symmetry. Our design code, Lasnex, was a 2D code, and thus not capable of accurately modeling this inherently 3D issue.

Another requirement, as we stepped through lasers and tried various hohlraum configurations, was an understanding of how hohlraum drive scaled. I applied a quantitative Marshak wave analysis to the problem of assessing how deep into the Au walls the non-linear, radiatively driven heat wave would penetrate. This, in turn, would determine how much mass was heated in a given time, and thus the drive temperature of the hohlraum walls that would bathe the capsule and implode it. This analysis correctly predicted hohlraum drive from those early days through to NIF ignition. 24,25 For this application, I found it convenient to invent a new system of units, "r.h.u." (radiation hohlraum units), whose constituents included time in nanoseconds (ns), distance in mm, energy in hectojoules (hJ, meaning hundreds of joules), and temperature in hecto-electron volts (heV). In these units, the basic black body radiates, per unit area, as σT^4, with σ conveniently at a value very close to unity. Hohlraums to this day are 1–3 heV, so that too makes heV the unit of choice. My colleague at the time, Roger Bangerter, who is a pioneer in the concept of heavy ion driven ICF, insisted on calling "r.h.u." "Rosen's Hebrew Units," and heV "Hebrew electron volts."
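The claim that σ is "conveniently close to unity" in r.h.u. is easy to check numerically. The sketch below is my own check, not code from the paper; it converts the SI Stefan–Boltzmann flux at T = 1 heV into r.h.u. flux units (hJ per ns per mm²):

```python
# Check that the Stefan-Boltzmann constant is close to unity in
# "radiation hohlraum units" (r.h.u.): time in ns, length in mm,
# energy in hJ (hectojoules, 100 J), temperature in heV (100 eV).
SIGMA_SI = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
EV_TO_K = 11604.5           # kelvin per eV

T_heV = 1.0                              # one heV = 100 eV
T_K = T_heV * 100.0 * EV_TO_K            # temperature in kelvin
flux_SI = SIGMA_SI * T_K**4              # blackbody flux, W/m^2

# One r.h.u. flux unit: 1 hJ / (1 ns * 1 mm^2)
#   = 100 J / (1e-9 s * 1e-6 m^2) = 1e17 W/m^2
FLUX_RHU_SI = 100.0 / (1e-9 * 1e-6)

sigma_rhu = flux_SI / FLUX_RHU_SI
print(f"sigma in r.h.u. = {sigma_rhu:.3f}")   # ~1.03, i.e., close to unity
```

So a 1 heV hohlraum wall radiates almost exactly one hJ per ns per mm², which is what makes the unit system so convenient for back-of-the-envelope hohlraum energetics.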

At the time, all of this was classified, as it would hint at the Teller–Ulam scheme. The work was published in classified annual reports of the LLNL Laser Program. Only over a decade later, when other research efforts throughout the world started publishing work along these lines, would this early work of mine see the light of day. In addition to Refs. 24 and 25, which came later, major sections of Ref. 13 describe this work, as do the works of Kaufman et al. 26 and Suter et al. 27

The term Marshak wave comes from the work of Robert E. Marshak, which he performed 28 during the Manhattan Project. It is a non-linear heat wave since the heat conductivity (in our case, photon driven) is not simply a constant that depends on the material in question, but rather depends on the temperature. Marshak was a thesis student of Hans Bethe (on white dwarf stars) and served as his deputy at Los Alamos during the Manhattan Project. After the war, he did important research, at the University of Rochester, in formulating the weak interaction approach in high energy physics that culminated in Richard Feynman and Murray Gell-Mann getting the Nobel Prize. Two of his thesis students at Rochester were Al Simon and John Greene. Al later played an important role in laser plasma physics at the URLLE, and I always enjoyed interacting with him. John Greene went on to do MHD theory at PPPL and is the “G” of BGK (Bernstein–Greene–Kruskal) plasma waves. John was my thesis advisor. Thus, in the “Academic Family Tree” methodology, in which “thesis advisor = parent,” Marshak was my academic “grandfather,” and Bethe my “great grandfather.”

Marshak was president of the American Physical Society (APS) and died in a swimming accident in Cancun, Mexico. Sadly, I never met him. On the other hand, when I was Division Leader for ICF theory and design in the 1990s, I had the privilege one day of hosting Hans Bethe and briefing him on our plans for the NIF. Later that night, Bethe gave a public lecture at UC Berkeley on the role of neutrinos in supernova explosions. Every seat in the two-story, 525-seat Pimentel lecture hall was taken. I sat on the steps with my three young children. After the talk, I introduced them to him. I simply wanted them to meet a great man. (I don't think that at the time I was aware that he was my academic "great grandfather.") Ironically, just as I had extended the work of "grandfather" Marshak by developing it further throughout my career, later in life I extended 29 the work of "great grandfather" Bethe, by expanding on the so-called "Bethe–Feynman" formula used during the Manhattan Project.

I adapted my earliest work on Marshak waves from research notes of our former LLNL Director Mike May. While this work was close enough to correct to explain the data on hohlraum drive that we were collecting, there were some small but (to me) annoying inconsistencies in it. Since dE/dt must equal the divergence of a diffusive flux, F, then, as a check on proposed solutions, the spatial integral over the energy profile must equal the temporal integral of the flux at the boundary. The fact that the solutions I was using very nearly matched up in this way, but not exactly, bothered me for many years. Some 25 years later, in 2003, Jim Hammer suggested a formal expansion in a small-ish parameter, which together we carried out in full to second order, and which fixed the former inconsistencies. 30 I am forever grateful to Jim for finally and fully putting my mind to rest.
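The consistency check described here is just energy conservation integrated over the heated wall. Writing e for the energy density profile and F for the diffusive flux, with x measured into the wall from its surface (symbols chosen here for illustration, not notation from the paper):

```latex
\frac{\partial e}{\partial t} \;=\; -\,\frac{\partial F}{\partial x}
\;\Longrightarrow\;
\int_{0}^{\infty} e(x,t)\,dx \;=\; \int_{0}^{t} F(0,t')\,dt' ,
```

so any trial Marshak-wave solution whose spatial energy integral fails to match the time-integrated surface flux cannot be exactly self-consistent.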

In those early days, I recall having to appear and present my design results, as well as my analytic work on target performance and hohlraum theory, to a weekly review board co-chaired by the then lab director, Roger Batzel, a nuclear chemist, and by the former lab director, Mike May, a physicist and weapons designer. I was impressed by the high level of scrutiny the ICF program was being subjected to, even internally by LLNL management. I was also impressed by the LLNL tradition of giving young, new staff, such as myself at that time, high responsibilities so early in their careers, and high exposure to upper management. I was particularly touched by these directors' eagerly accommodating my constraints when those meetings conflicted with my need to take off for Jewish holidays.

One of the lessons learned from this history is the importance of the engagement of upper lab management in the health and direction of its ICF program. I think this has been true throughout the history of the project and has served the program well. During the stressful times of the construction of the NIF, LLNL lab management went out of their way to accommodate the NIF team's needs in terms of support and needed personnel, to make their job somewhat easier. Certainly, in later years, I participated in regular briefings before Bill Goldstein, the director of LLNL preceding its present director, Kim Budil. Kim, as head of the weapons program, was also in attendance at those Goldstein briefings, and she has been an active, creative, clear-thinking, and steadfast supporter of the ICF program during her current tenure as LLNL director.

Soon after 1976, the Shiva laser was completed. It was built for direct drive, with 20 beam lines.

One of my first target design projects (supervised by Jon Larsen) was the design of the so-called “exploding pusher” targets for Shiva. An exploding pusher is a thin shell that contains the DT gas. The driver, either through electron conduction or via hot electrons, completely heats through that thin shell. The shell, now dense and hot, namely, at high pressure, explodes inward and outward. The inward going shock heats the DT gas, which is then further heated by the compression of the incoming half of the shell. High temperatures can be achieved in this scheme, though densities are not nearly as high as in an ablatively driven shell system.
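The shock heating step described above can be estimated from the standard strong-shock (Rankine–Hugoniot) relations for an ideal gamma = 5/3 gas. The sketch below is a generic back-of-envelope estimate, not the simple model of Ref. 31; the shock speeds are illustrative assumptions, and mu = 1.25 amu is the mean particle mass of fully ionized DT:

```python
# Strong-shock post-shock temperature: kT2 = (3/16) * mu * m_p * v_s**2
# (ideal gas, gamma = 5/3, upstream pressure negligible). Shock speeds
# below are illustrative, not values from the text.
M_P = 1.6726219e-27     # proton mass, kg
EV = 1.602176634e-19    # 1 eV in J

def strong_shock_temperature_eV(v_shock_m_s, mu=1.25):
    """Post-shock temperature (eV) behind a strong shock.

    mu = 1.25 is the mean mass per particle (amu) of fully ionized DT:
    ions of average A = 2.5, plus one electron each.
    """
    return (3.0 / 16.0) * mu * M_P * v_shock_m_s**2 / EV

for v_km_s in (100, 300, 1000):
    T_eV = strong_shock_temperature_eV(v_km_s * 1e3)
    print(f"v_shock = {v_km_s:5d} km/s  ->  T ~ {T_eV:7.0f} eV")
```

The quadratic dependence on shock speed is the point: driving the thin shell to very high velocity is what buys the high DT temperatures, even without high compressed density.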

One of my first papers in the field of ICF was a simple model 31 for the physics processes that dominated the behavior of these exploding pushers. There was very good agreement between the model and the results of the full hydrodynamic simulations. Shortly after publication, I happened to meet Dr. Robert Dautray, who led the French ICF efforts. I was quite taken aback when he told me that my work “was precious” to their efforts. Be that as it may, I must say that doing that work settled my career path for me. To be able to formulate a simple model and then to test it immediately with “virtual experiments” using the complex simulation codes was a sheer delight. This delight has blissfully lasted for me continually, with changing applications, for the past 50 years.

There was great interest throughout the ICF community as to what yield would emerge from this first Shiva experimental campaign. As optimistic calculations suggested yields of order 10^12 neutrons, that is where many of the guesses in this “betting pool” were made. Don Slater of KMS Fusion, a private laser fusion company in Ann Arbor, Michigan, which closed down in 1990, guessed the speed of light (in cgs), namely, 3 × 10^10. He was the winner of the competition, as that was quite close to the experimental result. I am not sure we ever understood the reason for the underperformance of that system.

In January 1980, I and my LLNL colleagues experienced an earthshaking moment. A 5.8 magnitude earthquake caused a great deal of damage and a great deal of shaking. It turned out it was on a little-known fault, the Greenville fault, which passes within 1 mile of the lab. Prior to this event, we were unaware of its existence. So when the building shook so violently, I imagined that this was actually an earthquake centered near San Francisco, some 40 miles away, and that it was truly catastrophic over there. At the time I had just bought a house in the Berkeley hills and had not yet sold my old one in the Berkeley flats. I imagined my new house rolling down the hill and crashing into my old one. Amidst the shaking, I crawled under my desk and called my wife, Rena, at home. To my relief, she did not know what I was talking about. The earthquake had not affected Berkeley (or San Francisco) at all! I went back to work at my desk when the shaking stopped, but I was advised to exit the building, along with everybody else. I was on the ground floor, and I had not known that my colleague Judy Harte, on the second floor, had her entire ceiling come down on her (luckily vacant) desk.

In another part of the LLNL square mile site, Bill Kruer was just about to give a lecture on LPI in front of a full auditorium when it hit, and the auditorium was evacuated. Exactly 1 week later, when his lecture was rescheduled, an aftershock hit and the auditorium was cleared again, earning Bill quite a reputation as a “moving” speaker. In the main library, the bookshelves collapsed on each other like falling dominoes. Many lessons were learned from this event. Most involved safety features that are installed throughout the site. Some involved laser architecture and planning. The lasers needed to be built on their own floating “tables,” disconnected from the buildings they were in, and automated pointing software and hardware were needed to accurately align targets, lasers, and diagnostics. This is true to this day on the NIF.

Indirect drive experiments at Shiva, with x-ray ablated glass shells, were aimed at compressing DT to 100× its “standard” initial density of 0.25 g/cc. This was a continuation of an earlier, failed effort at Argus. The Shiva targets underperformed. It was assumed that LPI was creating hot electrons. These hot electrons would penetrate deep into the capsule, preheat the fuel, and thereby prevent the achievement of the sought-after high density.

The roots of this difficulty on Shiva extend back to the failed attempts at achieving “100×” in the Cairn experiment on the Argus laser. My colleague Bill Mead (then at LLNL, and later at LANL) was the lead designer and the individual responsible for this campaign. An experiment was proposed to test this hot electron preheat hypothesis: 32 Instead of a full hohlraum, with two laser entrance holes (LEHs) and a capsule in its center, it would be a half-hohlraum (“halfraum”), or, for this first try in the Cairn campaign, it was called a “half-Cairn.” It is pictured in Fig. 4 . It would be a cylinder only 1/2 the length of a full one, with only one LEH. The opposite endplate would be at the position of the midplane of what would be a full hohlraum. That endplate had a hole cut into it, and a glass slab of thickness equal to the capsule shell would be attached on the inside. External diagnostics could thus view the “inside” of the capsule, by looking at the cold, undriven side of the glass slab, as it was that side that was adjacent to the hole of that endplate. Hot electron production within the hohlraum would be monitored by the “FFLEX” diagnostic that measured the hard x-ray bremsstrahlung produced when most of those hot electrons plowed into the gold wall of the halfraum. During this era, targets were expertly made by an LLNL in-house team led by Chuck Hendricks and Bill Hatcher.

An early “half-raum” used to get diagnostic access to the “inside of a capsule” by viewing the surrogate slab's undriven side through the hole in the back plate of the halfraum. Original art.

Before we discuss the results of this experiment, let us remark on the paradigm shift represented by this halfraum. It was a departure from doing full hohlraums with a capsule imploded with it, with new implosion experiments done for each new generation of laser driver. In some sense, this modus operandi was no surprise, as the testing of nuclear devices at the Nevada Test Site (NTS) followed the same script. A new device would be placed down-hole, and an entire array of diagnostics placed above it, reporting out data at the speed of light just before they would be pulverized by the nuclear blast. Thus, the halfraum was a change that would not concentrate on yield performance, but rather on physics understanding. If successful, it would open a new era of halfraums as “physics factories” that could study a wide variety of physics issues that occur within the “high energy density physics” (HEDP) regime. The key word in the previous sentence is “if.” The preliminary data from this experiment cast doubt on the whole approach and thus threatened the birth of this new field of HEDP.

The FFLEX instrument worked quite well, and from its signal, we could infer 60 J of hot electrons at a temperature of 70 keV. The two instruments monitoring the cold side of the glass slab were a “Dante,” a time resolved, multi-channel broadband x-ray detector (though its field of view was extremely broad), and a streaked optical pyrometer, which was not only time resolved, but whose field of view was also highly localized. They both gave a very early signal that could be interpreted consistently as a ∼2 eV preheat signal, followed by a ∼10 eV shock breakout. Given the FFLEX results, and a measured 140 eV drive, these preheat and shock signals were in reasonable agreement with expectations. However, later in time, the two diagnostics reported some very large, unexpected signals that, moreover, diverged rather widely from each other both in maximum signal level and in temporal behavior. In short, these late-time signals were quite large and not understood. The signals are pictured in Fig. 5. This mystery cast a cloud over the entire enterprise of doing such “physics” experiments, as it seemed as if they raised more questions than they answered.

Left-hand side: Setup of the experiment. Reproduced from M. D. Rosen, Phys. Plasmas 3, 1803 (1996) with permission of AIP Publishing. Right-hand side: The confusing late-time signals seen by the two instruments in the first half-raum experiment. Original Art.

I was brought in to look at these mysteries with a “fresh set of eyes.” I saw immediately what the problem was. The mindset had been that these diagnostics continued to see the cold side of the glass slab even at late time. What was forgotten was that this glass slab, like the glass shell of an imploding capsule that it was meant to represent, is a radiation ablation driven rocket. The slab could move, and indeed did move. It popped right out of the halfraum through that diagnostic hole in the halfraum's endplate, in a cookie-cutter like manner. As such, the two diagnostics, each viewing the back of the halfraum from an angle, would eventually see the hot, driven side of the glass slab as it cleared an axial position that would allow that “hot drive” to be in the line of sight for each detector. This explained their high late-time signals and the difference in timing of each diagnostic, given their different angles/lines-of-sight. In fact, from the time difference of their large “hot side” signals, one could deduce the velocity of that slab “cork popping” its way out of the halfraum (see Fig. 6).

Explanation of the signals, as the two instruments see the hot drive of the sample at different times, as it “cork-pops” out of the back plate's diagnostic hole and moves outward. Original art.

With this successful explanation of all aspects of those signals, and with the additional “bonus data” of measuring the slab's x-ray ablative acceleration and motion out of the halfraum, the field of HEDP was rescued from being stillborn. We were soon measuring the motion directly by x-ray backlighting, a joint effort between LLNL and the Naval Research Laboratory (NRL). 33 Then we were soon measuring x-ray driven shock breakouts through stepped samples to measure equations of state along the Hugoniot, x-ray burn-through of high Z samples to measure opacities, and eventually (after quite a few struggles) we were able to measure hydrodynamic instability growth rates. This RTI work 34 was a design tour-de-force by Dave Munro, and an experimental achievement led by Bruce Remington. It won the APS “Excellence in Plasma Physics Award,” also known as “The John Dawson Award,” in 1995, and was the first HEDP project to do so. A short summary of all of these developments stemming from this first HEDP experiment devoted to preheat measurement, as described above, can be found in my 2001 Teller Award lecture, 35 which was given at a conference in Kyoto, Japan, a few short days after September 11. Being stuck in Japan, with no flights leaving for a week, was a surreal experience. I will always be grateful for the care and concern that the Japanese people expressed to all of us U.S. conference attendees during this very stressful time.

This robust new field of HEDP suggested a whole body of work that could inform the high energy density physics regime found in nuclear weapons. An LLNL committee, led by Carol Alonzo, a weapon designer, was formed to flesh out this proposed body of work. I partnered with Abraham Szoke to put forward about 100 pages of proposed ideas for laser driven experiments. Abe was literally “old enough to be my father,” as he had a son older than me. He was a distinguished laser physicist 36 with expertise in other areas, such as crystallography and holography. He survived the Holocaust as a teenager in Budapest, and later escaped from the communist regime there after the war. Forty years later, well past his retirement from the lab, and well into his 80s, Abe would still come to my office nearly daily to work on projects of mutual interest. I miss him greatly.

The committee's recommendations were only partially embraced by LLNL management, who were too busy conducting full scale, underground nuclear tests at a rate of nearly one per month. Colleagues at the Atomic Weapons Establishment (AWE) at Aldermaston in the UK, with a far smaller frequency of full tests, were far more enthused about HEDP and embraced this field rather wholeheartedly. 37 My chief counterpart from AWE, who was also suggesting quite similar HEDP laser driven experiments, was Brian Thomas. Brian shared my penchant for simple models. It was a great comfort to know that there was at least one other person on the planet who had interests and goals identical to mine. Brian is a polymath who has an encyclopedic knowledge of early blues and rock and roll music (and a record collection to match it). In his retirement, he has written an extensive two-volume history of his beloved Wales, and he is an accomplished poet. Sir Brian was knighted for his HEDP work. At LLNL, the HEDP laser driven experimental activity persisted throughout the eighties and nineties, with great support from AWE, and, as we shall see, was quite critical in making the case for the NIF in the mid-90s after the cessation of nuclear testing.

With the difficulties encountered in the Shiva experiments to achieve high imploded densities, and with the proof from the halfraum experiment that those difficulties were due to hot electrons, it was time for a change. The laser wavelength, λ, used up to that time was 1.06 μm, the natural “color” of the Nd:glass (silica) laser slabs of the lasers built to date. It was clear that shorter wavelengths would do better, at a given irradiance I, since LPI thresholds usually trigger on the quantity Iλ². A very important paper in the history of the field of ICF was Ref. 38, a French work that showed how absorption increased with shorter wavelength. One of the early workers on LPI at LLNL, Claire Max, was on sabbatical in France and was a coauthor of this paper. Important for indirect drive was Ref. 39, the research that showed how the efficiency of conversion of laser light to x rays also increased with shorter wavelengths. Since the critical density scales as λ⁻², a shorter wavelength laser brought energy to higher densities, whereupon electron conduction could transfer that energy to even higher densities above critical, leading to a more efficient generation of x rays from that dense and hot region.
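The λ⁻² scaling of the critical density follows directly from n_c = ε0 m_e ω²/e², with ω = 2πc/λ. A minimal sketch, evaluating it for the Nd:glass fundamental and (as an assumption for illustration) its exact subharmonics:

```python
import math

# Critical electron density n_c = eps0 * m_e * omega**2 / e**2: the
# density above which laser light of frequency omega cannot propagate.
# Since omega = 2*pi*c/lambda, n_c scales as lambda**-2, so shorter
# wavelengths deposit energy at higher density.
EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m
M_E = 9.1093837015e-31    # electron mass, kg
E_CH = 1.602176634e-19    # elementary charge, C
C = 2.99792458e8          # speed of light, m/s

def critical_density_cm3(lambda_um):
    omega = 2.0 * math.pi * C / (lambda_um * 1e-6)
    n_c_m3 = EPS0 * M_E * omega**2 / E_CH**2
    return n_c_m3 * 1e-6   # convert m^-3 to cm^-3

for name, lam in (("1w", 1.053), ("2w", 1.053 / 2), ("3w", 1.053 / 3)):
    print(f"{name}: lambda = {lam:.3f} um, "
          f"n_c = {critical_density_cm3(lam):.2e} /cm^3")
```

The fundamental gives n_c of roughly 1e21 per cm³, and tripling the frequency raises it ninefold, which is the advantage exploited by the 3ω conversion described next.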

The next question then would be how to efficiently convert the inherent 1.06 μm light of the Nd:glass lasers to shorter wavelengths. Pioneering work 40 by Steve Craxton and colleagues at the University of Rochester Laboratory for Laser Energetics (URLLE) showed how this could be done through non-linear conversion of wavelengths, first to green light (“2ω”) and then to ultraviolet light (3ω and 4ω), as the light passes through potassium dihydrogen phosphate (KDP) crystals. This method is embraced and utilized to this day on the NIF.

This crucial decision to pursue a path to ignition using shorter wavelength laser light had major implications. The big laser planned as a follow-on to Shiva was Nova. It was conceived as a 200 kJ, 20 beam facility, configured for indirect drive. The 200 kJ was originally thought to be sufficient to have a chance at ignition. In retrospect, we now know that that guess was off by a factor of 10. More importantly, however, that 200 kJ was at 1.06 μm light, which was now recognized to be an unacceptable wavelength. This became one of those moments where the fate of ICF and the pursuit of ignition hung in the balance. The program did survive this “near death experience.” The project, now incapable of reaching ignition, was cut back to 10 beams, 120 kJ at 1.06 μm light, and 30 kJ of 3ω, 1/3 μm light, as it passed through the KDP crystals.

Ironically, at this same time, the LANL laser fusion program was experimenting with a CO 2 laser whose wavelength was quite long, at 10  μ m! As the reader should not be surprised by now, their experiments were chock-full of hot electrons. As LANL was also pursuing indirect drive, it must have surely been a bitter disappointment to them that the measured 41 conversion efficiency of laser light to x rays using this laser was a microscopic 2%. These very poor results quickly put the issue to rest: LANL would not be authorized to build a large laser. Instead, their technical staff was advised to help the LLNL Nova efforts in LPI theory, target design, and experiment, as well as in target fabrication techniques. My assessment is that their support in all four of these areas, as well as their independent ideas and contributions, were all indeed helpful in pushing the ICF Program along on its ultimate path to ignition.

The Novette laser was a two-beam facility that was a prototype of the ten-beam Nova. It could operate at 2ω, 3ω, or 4ω light. With its green light capability, on Friday, July 13, 1984, we created the world's first extreme ultraviolet “x-ray” laser 42,43 (XRL) using exploding foils as the plasma medium, in which a Ne-like Se plasma lased on 3p–3s transitions at 206 and 210 Å. An earlier target design that considered using thick (non-exploding) foils showed (in the simulations) regions of higher laser gain, but suffered from the resultant XRL beam refracting off of steep gradients. Because we had some experience with exploding foils for LPI experiments for ICF, Mike Campbell suggested we use them as lasing media. The regions of gain had lower peak gain values, since the density was lower, but did manage to let the XRL propagate down the lasing axis for a respectable distance. The idea worked, and we share the patent for this approach. 44 It won the APS “Excellence in Plasma Physics Award,” also known as “The John Dawson Award,” in 1990, and was the first non-MFE project to do so.

This lesson of using the exploding foil, and thus retreating from a “higher gain” design to something with less potential performance, but more “stable” in some other sense, would be exactly repeated in ICF ignition target design, as we shall see below. Overall, another valuable lesson learned on Novette was that the 4ω option caused too much damage to the optics. As a result, Nova (and to this day, NIF) was chosen to operate at 3ω.

The aftermath of our x-ray laser work was an interesting and early example of the concept of “deterrence by capability,” which later formed one of the bases for embarking on the mission to achieve ignition. While the U.S. and the U.S.S.R. both pursued the “Star Wars” concept of a nuclear weapon as a pump for an x-ray laser that perhaps could shoot down incoming missiles, it is not clear how the Soviets assessed the U.S. progress in this field. Whatever would be reported in the popular U.S. press would likely be judged by the Soviets as their own favorite device: disinformation. The only things that were for real were scientific, peer-reviewed publications, later bolstered by genuine repeats of the feat by other labs across the globe. The successful work on x-ray lasing at Novette was just such an achievement. Moreover, it used a lasing scheme suggested by the Soviet scientist Vinogradov. 45 Even after the cold war was over, I was amazed at the constant interest and questions posed to me by Russian scientists on this subject. It is speculated that over-spending on this Russian version of “star wars” was the tipping point in the collapse of the U.S.S.R. If so, the Novette x-ray laser was indeed the first example of “deterrence by capability.” 46

A second outgrowth from the x-ray laser work was the high-quality work force that it attracted to LLNL. Rich London, LLNL's first post-doc, went on to a distinguished career in target design of XRLs as well as a diverse set of HEDP experiments. Experimentalists such as Brian MacGowan and Bruce Hammel went on to become leaders of the LLNL ICF Program. Chris Keane would go on to become a leader at the DoE office in charge of the NIF Program, and he is currently Vice President for Research at Washington State University. Nino Landen is currently Deputy ICF Program leader for experiments. Jim Trebes would go on to lead the Physics Department at LLNL. The message here, relevant to the ignition quest, is that bold and challenging projects attract the best and the brightest, which in itself is a great benefit of such a high risk pursuit.

Meanwhile, the proponents of direct drive ICF knew that progress needed to be made on smoothing the inherent speckle structure of the driving lasers that could seed hydrodynamic instabilities. Many schemes were invented in the 1980s that did precisely that. 47–50 This important body of work won the APS “Excellence in Plasma Physics Award,” also known as “The John Dawson Award,” in 1993, and was the first ICF specific project to do so. The field of indirect drive was happy to adopt these techniques as well, since unsmoothed beams entering a hohlraum could trigger LPI in those intense speckles. 51

The period of the mid-1980s through the 1990s brought changes to the LLNL Laser and ICF Program. John Nuckolls went on to become head of the Physics Department, and then Laboratory Director. John Lindl replaced him as head of the ICF theory and design effort, and eventually head of the ICF Program. Jim Davis replaced John Emmett on the laser side, and eventually he was replaced by Mike Campbell. Mike's contributions to and leadership of the experimental program on Nova were manifold. Nova showed that pulse shaping indeed helped to increase implosion convergence, and that the Rayleigh–Taylor instability was reduced by x-ray ablation, as predicted. Low mode implosion symmetry could be tuned by varying the laser pointing. The hohlraums could reach the predicted high temperatures (250–300 eV). These hohlraums were illuminated by relatively short pulses, and thus could be empty. Under those conditions, LPI was rather minimal.

Meanwhile, there was a robust program using the copious energy generated in underground nuclear explosive tests to explore the fundamentals of the ICF strategy. Both LLNL and Los Alamos (LANL) participated in this “Halite/Centurion” program. Experimental managers from the LLNL's ICF program such as Hal Ahlstrom and Erik Storm were very much involved in this endeavor. Its details remain classified. What is true is that the results from this program “demonstrated excellent performance, putting to rest fundamental questions about the basic feasibility to achieve high gain.”

Thus, from an energy driver point of view, with Nova low, and Halite/Centurion high, it seemed as though the ICF program had the problem of achieving ignition “surrounded.” Flush with these successes, LLNL proposed to build a facility, the “Laboratory Micro-fusion Facility” (LMF) that would produce yields in excess of 100 MJ, a quantity of interest to the weapons program, and, if successful, could provide useful information for Inertial Fusion Energy (IFE) civilian power production efforts. Lindl and co-workers estimated that the LMF would need to be of order 10 MJ of 3ω light, a factor of 300 greater than Nova, to ensure ignition and propagating burn. The target would have a Be ablator and operate in a large 250 eV hohlraum.

The National Research Council and other trusted outside experts advised against this large, 300×, extrapolation from Nova. Lindl and co-workers heeded this advice and devised a riskier target operating in a 300 eV hohlraum that could achieve ignition at 1 MJ. Thus, a 1.8 MJ, 500 TW facility was proposed that would have a “margin” of a factor of nearly two. This, of course, was the NIF. The operating space (in power, energy coordinates) is shown in Fig. 7 . The constraints in power requirements on the sides of this “bird's peak” operating space were determined by LPI constraints from above, and hydrodynamic instability growth from below. The factor of ∼2 margin is present along the diagonal, but there was no guarantee that the sides would not collapse into the operating space, squeezing the successful target space up and to the right to nearly have no margin left. In retrospect this is essentially what happened on the NIF on the way to ignition using every-last-ounce-of-energy (and then some) from the laser. Once again, the navigation between the Scylla and Charybdis of LPI and hydrodynamic instabilities proved treacherous indeed.

The operating space for the NIF (power vs energy) and the perceived operating space for the ignition target point design. Reproduced from J. Lindl, Phys. Plasmas 2(11), 3933 (1995) with the permission of AIP Publishing.

It is somewhat ironic that, now that ignition has been achieved, folks with no knowledge or memory of these late 1980s developments ask the question: “Well, why didn't you just ask for a 10 MJ driver in the first place?” I hope the above description answers this question. Had the ICF program been stubbornly insistent on the “sure bet” (from a target performance point of view) 10 MJ facility, we might never have gotten funding for it and might still be waiting for ignition in the laboratory. Though the 10 MJ, high gain and very high yield facility for both weapons physics studies and for IFE was not authorized, it was always conceived as the next step, once NIF had demonstrated ignition and moderate gain.

A key ingredient in getting the NIF approved was to fulfill the “Nova Technical Contract” (NTC). This constituted a dozen milestones that demonstrated good performance on Nova, of relevance to the proposed NIF ignition target. About a half dozen involved hohlraum drive, symmetry, and LPI issues. The other half dozen involved implosion convergence and growth rates of hydrodynamic instabilities, and mix occurring in the imploded core as a result. The detailed list can be found in the appendix of a later review paper by Lindl and co-workers. 52  

In 1990, I was named head of LLNL's X-division, succeeding John Lindl who had succeeded John Nuckolls. This division was made up of components such as LPI basic theory and code development, hydrodynamic design code development, and several groups of target designers. I have no illusions of grandeur about why I was chosen to lead the Division. The design group leaders in charge of hohlraum physics and capsule design, Larry Suter and Steve Haan, respectively, were far too valuable to the future efforts of achieving ignition on the NIF. They could not be spared for, and certainly not be subjected to, the inevitable burdens and distractions from technical work of Division management. Another group, originally led by Roger Bangerter, and then Max Tabak, dealt with other driver technologies such as pulsed power and heavy ion driver ICF research. My own design group, which had evolved from HEDP to x-ray lasing, to ultra-short pulse laser physics and extreme ultraviolet (EUV) x-ray lithography source design and optimization, could and would be ably led by my replacement, Rich London. Thus, I was available and the logical choice for X-Division leadership.

As head of the ICF target design, code development, and basic theory effort in the 1990s, I was responsible for all of the work from that end supporting the achievement of these 12 NTC milestones. Joe Kilkenny was my counterpart responsible for his team that was executing the experiments and innovating and fielding the diagnostics associated with them. Colleagues at LANL were very active participants in design, target fabrication, and experimental efforts in these campaigns. Colleagues from Sandia National Lab (SNL) contributed their technological expertise in diagnostic development.

The progress on fulfilling this NTC was monitored every 3 months, first by the ICF Advisory Committee (“ICFAC”), chaired by Venkatesh (“Venky”) Narayanamurti (who has served as Dean of Engineering, first at UC Santa Barbara, and then at Harvard), and later by the National Academy of Sciences' (NAS) specially appointed NIF review committee, co-chaired by Steve Koonin (then Provost of Cal Tech, and later Under Secretary for Science at DoE) and Hermann Grunder (former Jefferson National Accelerator Facility director, and then director of Argonne National Lab). This was an intensely stressful process for the staff, as it seemed like the best outcome one could hope for every 3 months was the chance to come back 3 months later and do it again. Failure at any point meant cancelation of the project.

During this intense period of reviews every 3 months, I ran into former LLNL director Mike May, who, as described above, was the force behind the weekly reviews of ICF by upper LLNL management in the mid-70s. Mike asked me how I was doing, and I complained about the heavy load of reviews. I was taken aback by the vehemence of Mike's reaction. He forcefully asserted that I needed to appreciate the review process. To paraphrase his remarks: “Smart people are devoting their most precious resource, their time, to listening to you about your program and its problems. This is a gift and a privilege, not a burden.” I took this rebuke to heart, and to this day have adjusted my attitude toward reviews as an opportunity, not a burden. What is certainly true, in my long experience, is that even if a review committee has little to contribute, the very act of the ICF program getting together to prepare the review material is highly valuable. It helps with communication across broad areas of the program and also helps formulate more clearly an overall and integrated strategy and approach to the program's research results and its plans moving forward.

A rather dramatic event happened mid-way through this process. Our colleagues at LANL, led by Melissa Cray, along with excellent designers such as Bill Krauser, Bernie Wilde, Doug Wilson, Nels Hoffman, Steve Coggeshall, Norm Delameter, Bill Varnum, and David Harris, using the same code, Lasnex, imported from LLNL, separately calculated the proposed ignition target. They did so in an integrated manner, with the capsule within the hohlraum. With the laser pointing that Steve Haan and his team had specified, they found the implosion to have a “P4” asymmetry component. The very high convergence amplified this asymmetry, and the implosion fizzled, with an “X” shaped implosion image (rather than the desired “O” shape!). This had a chilling effect on the review committee, and, frankly, made me appreciate how difficult this task would be. Our own Steve Pollaine worked tirelessly, recalculated the implosion with an adjusted laser beam pointing, and achieved ignition (in the code) 53 just barely in time for the next review in the December timeframe. Steve called it his “Chanukah miracle of light.” If anything, this episode taught us some humility, and the need for an empirical tuning of symmetry once experiments began.

The staff worked extremely hard, and, in the end, did accomplish the NTC. I recall once coming back into the lab past 9 pm, after I had to go out of the lab to a charity dinner event in Oakland. The staff was still there, exhausted but working. I recall specifically Peter Amendt, Chris Keane, and Linda Powers barely able to stand up straight as I talked to them in the hallway that night. I do not think I have ever had a prouder moment as Division leader than in that encounter.

From the theory and design point of view, noteworthy are Larry Suter, who was responsible for hohlraum work (featuring Steve Pollaine, Linda Powers, Chris Keane, Ron Thiessen, Tom Shepard, and Peter Amendt), and Steve Haan, who was responsible for capsule implosion and RTI work (featuring Steve Weber, Steve Hatchett, Dave Munro, Kirk Levedahl, and Tom Dittrich). Their experimental LLNL counterparts include Joe Kilkenny, Bruce Remington, Nino Landen, Don Phillion, Bruce Hammel, Brian MacGowan, Fred Ze, David Ress, John Porter, Harry Kornblum, Bob Turner, Siegfried Glenzer, Bob Kirkwood, Chris Darrow, David Montgomery, Ted Orzechowski, and John Moody, along with LANL experimentalists Alan Hauer, Warren Hsing, and Juan Fernandez. Bruce Langdon led the LPI efforts, featuring Ed Williams, Dick Berger, Bert Still, Barbara Lasinski, Kent Estabrook, Denise Hinkel, Chris Decker, Scott Wilks, and Bedros Afeyan, along with the Division's Chief Scientist, Bill Kruer. Other outstanding efforts on equations of state (EOS) in the HEDP regime, which we use to this day, involved Richard More, 54 Yim Lee, 55 David Liberman, and Jim Albritton (who sadly passed away during the writing of this manuscript). Further support for EOS and opacity efforts was provided by the LLNL Physics department, including the efforts of Brian Wilson, Carlos Iglesias, and Bill Goldstein.

Max Tabak led the advanced projects effort that involved Heavy Ion Fusion Target Design 56 (featuring the late Dennis Hewitt, Alex Friedman, David Grote, Jim Mark, Darwin Ho, Grant Logan, Mike Glinsky, Charles Orth, and an up-and-coming star, Debbie Callahan), pulsed power applications 57 (featuring Jim Hammer, whom I had hired into the Division from the MFE part of LLNL), and fast ignition. 58 The first few sections of that classic paper on fast ignition (and its first two references) lean heavily on the (numerical) gain model of Meyer-ter-Vehn 59 and on my extension of it, 60 which is entirely analytic and moves smoothly from the isochoric to the isobaric ansatz for the assembled fuel configuration. Rich London continued in my now vacated role, to lead efforts in x-ray lasing, ultra-short pulse work, and laser medicine modeling. His group featured Dave Eder, Steve Maxon, Charlie Cerjan, Rick Ratowski, and Steve Moon. I used to say that my goal as X-Division Leader was to be an “ex-division leader” and to be Rich London's post doc. The great diversity of the division's design efforts was helped greatly by a highly flexible “helper and post processor code,” Yorick, developed by Dave Munro.

George Zimmerman led the code efforts, featuring Judy Harte, Dave Bailey, David Kershaw, Ed Alley, Alexei Shestakov, Jose Milovich, Manoj Prasad, Nick Gentile, and Paul Dubois. George, Judy, and Dave are still working at LLNL to this day. During this time, we hired Marty Marinak, who would go on to develop the 3D code Hydra, 61 the present-day workhorse for NIF design. During this period, there were two technical developments that greatly aided our computational efforts. In the 1990 time frame, my colleague (to this day) Eugene Brooks reported to a Supercomputing conference his concept, which he famously called “Attack of the Killer Micros.” This paradigm-shifting concept was that efficient and cheaper micro-processors (and enough of them computing in parallel) would surpass the performance of supercomputers then in use (such as the products from the Cray company at that time). A second, synergistic, development, led by Paul Dubois, was to wrap the entire Lasnex code within a shell run in Basis. When we began shifting to micro-processors, Lasnex was ready to utilize them. When machines changed (as they did, rather frequently), the code could be up and running within an afternoon, because of its Basis-based portability. These efforts were overseen by Steve Langer of X Division. Other massive codes at LLNL took many weeks to adjust to such changes. These dual, synergistic developments allowed us to respond to the heavy usage need of keeping up with fulfilling the NTC. The lesson learned here was similar to the ICF lessons learned writ large, but applied specifically to computing: Be light on your feet, and be ready, willing, and able to jump ship to a better choice of platforms.

In addition to the target physics addressed by the NTC, the Nova facility embarked on an important undertaking: “Precision Nova.” Without the rigor and precision of the improved facility, the NTC would not have been accomplished. This, to my mind, foreshadows analogous efforts at “precision NIF” in the 2020s that got us “over the hump” and resulted in ignition. A second important effort during this time was the construction of the Beamlet, a prototype of one of the eventual 192 beams of NIF. Much was learned regarding laser technology in the construction and operation of Beamlet. Bruno Van Wonterghem was hired by Mike Campbell to work on the Beamlet. Bruno has been the dedicated NIF facility manager to this day. The Beamlet is now at Sandia National Laboratory and serves an important role as a pre-heat source for a cylinder of DT gas that is then imploded by pulsed power, in the MagLIF scheme. 62

Having completed all 12 milestones, we all thought that we were done with the review process. It was true that the NTC only covered areas within Nova's capability. Thus, no cryogenically frozen DT shell capsules, etc., were tested. Along those lines, the NIF point design 63 involved a rather long laser pulse and thus called for a gas filled hohlraum to partially hold back the Au walls from collapsing inward during that long pulse. Had such motion been allowed, it would have greatly challenged the ability to control time-dependent drive symmetry. At the very least, the low Z gas would be an easier medium through which the beams could propagate than the high Z plasma of an ingressing gold wall. All the NTC work involved vacuum hohlraums. Thus, the NAS committee asked us to study “one more thing.” They wanted us to test the performance of targets with warm gas filled hohlraums on Nova.

We did so. We used thin wall hohlraums to image where along the wall the beams propagated. For empty hohlraums, the beam imprints had been at the axial positions along the cylinder walls to which they were precisely aimed. In short, no surprise. However, with gas in the hohlraums, the beams bent and hit the walls at an axial position closer to the laser entrance holes (LEH) than originally aimed. This indeed was a surprise. While this beam bending behavior was reproducible, and thus correctable with an adjusted aim, as one would employ in archery (or golf) for “windage,” it raised the possibility of an inability to control symmetry if this behavior persisted or even worsened at the NIF scale. We needed to understand it.

The complete understanding came quickly. Harvey Rose of LANL (who sadly passed away during the writing of this manuscript) visited LLNL and reminded Ed Williams of LLNL that he (Ed) and Bob Short of URLLE had written a paper 64 (with Bob Bingham) describing the beam bending effect. Lasers can filament. In a flowing field of plasma, there are places where the flow field can resonantly stagnate and build up a stronger wall of density within the filament. The beam would then refract off that density wall and bend. The way to avoid this phenomenon was to avoid the formation of the filament in the first place. LLNL had developed an LPI simulation code, pf3D. 65 I worked with Denise Hinkel, then a newly hired post-doc from UCLA, in exercising the code on this problem. Denise showed that the relatively unsmoothed Nova beam would indeed filament, and then showed quantitatively that the beam bending would ensue. 66 Denise then showed that, with the smoothing techniques planned for the NIF, such behavior would be eliminated. This prediction was successfully demonstrated at Nova. 67

For reasons that are beyond me, I soon found myself chosen to summarize our case in front of the NAS review committee as they were about to ponder their final decision on whether to recommend that DOE approve the NIF project. It was obvious to me that ignition would be difficult and would probably not be achieved right away. Moreover, given the 60× leap in energy scale (and ∼4× in spatial scales) and the new challenges of cryogenic systems, there were bound to be plenty of surprises on the NIF. So that is exactly what I said in my “closing arguments.” I pointed out, however, that the committee, by its own choosing, had required the gas fill campaign, whose beam bending was a shining example of an unanticipated surprise, and that the wide ICF community had collaborated in reaching an explanation for the surprise and in offering up a fix for it on the NIF. As such, we could proceed into the future with our eyes wide open for surprises, but with some confidence that ICF, with its flexibility and its talented workforce acting in a collaborative manner, could overcome surprises in the future.

This argument seemed to sway the committee, which, to its credit (in retrospect), proceeded to recommend building the NIF.

However, as I look back, fulfilling the NTC was a necessary, but not sufficient, accomplishment in bringing approval to the NIF project. External world events were crucial. During the late 1980s and the early 1990s, the Soviet Union collapsed. As a “peace dividend” demanded by the public in response to these events, it became clear that significant expenditures could be avoided by the cessation of nuclear testing. Yet, it would be irresponsible for the Nation to weaken our deterrent by letting nuclear designer skills atrophy. The answer to this was ready and waiting: the use of high-power lasers to drive HEDP experiments that, as described above, were born and then matured at the previous LLNL/ICF lasers. This body of proposed work formed the bedrock of what would become the science-based Stockpile Stewardship Management Program (SSMP) that has lasted for over 30 years, to this day.

I recall the efforts put together by LLNL to brief high U.S. government officials on this new strategy, shepherded by Dr. Vic Reis at the Department of Energy (DoE). LLNL director Bruce Tarter, weapons program associate director George Miller, laser associate director Mike Campbell, Physics Department associate director Dick Fortner, and others at that level were huddled in a small room crafting the presentation. I, a “mere” Division leader, was the lowest ranked person in the group. I was there as the person who had started the HEDP efforts at LLNL (as described above) and was aware of its latest developments. One tack was to also emphasize more basic science that could be done with HEDP, such as laboratory astrophysics. 68 A title of one viewgraph mentioned studying the physics of aging stars. I warned George Miller (who would ultimately give the DoE briefing) that he should be careful how he said this, as he would be in a room “full of aging stars.”

It should also be emphasized that technical progress alone was insufficient here. Much effort on the political side would also be required. The LLNL management, and Mike Campbell in particular, was highly instrumental in getting the three national lab directors to sign a letter that included support of NIF, and in getting every member of the California Congressional Delegation to sign a letter in support of NIF construction. Getting New Mexico Senator Pete Domenici, the “patron saint” of Los Alamos, on board, was also quite critical.

This change in the global political environment, along with the ready, willing, and able field of HEDP, ultimately won the day for approval of the NIF. The SSMP would have as its cornerstones high performance computing and laser driven HEDP experiments. Achieving ignition on the NIF would have a several-fold utility. First, as a “stretch goal,” it would challenge the workforce to achieve a very daunting task. Second, if achieved, it would open new parts of parameter space into which to extend the domain of achievable HEDP in the laboratory. Third, in the absence of nuclear testing, it would act as an example of “deterrence by capability” for any and all adversaries to see. Finally, it would be considered the “waystation” on the path to a larger, 10 MJ scale facility that could reach high gain and high yield, of use both to the Stewardship Mission and to the idea of commercial use of ICF, namely, the Inertial Fusion Energy (IFE) enterprise.

In the early 90s, the period discussed above, I feel that another important development came into being. There had been significant and scientifically credible ICF work from the Japanese 69 and the German 70,71 efforts that involved theory and experiments with laser driven hohlraums. The original idea behind classifying this research at U.S. national labs was to protect the Teller–Ulam scheme for the H bomb. However, as a result of a November 1979 article by Howard Morland published in The Progressive magazine, and the legal issues that followed, the Teller–Ulam scheme was officially declassified. Given those facts, there seemed to be no reason left to classify the indirect drive approach being pursued at LLNL. Moreover, given the hoped-for push toward ignition with the NIF, there seemed to be plenty of reasons to engage a world-wide community to help in, and to participate in, this grand challenge quest. I gave the final briefing to advocate for the declassification in Washington, DC, before a broad inter-agency committee. When I flew out of Washington, DC, that night of January 17, 1991, I noticed a strange thing. As the plane flew over the Pentagon, every single light in every single office was on. Later during the flight, it was announced that Operation Desert Storm was under way and that bombs were falling on Baghdad.

Quite apparently, the briefing had its intended effect. It resulted in a declassification in 1994, and subsequent publication of several articles 26,72–74 describing our research efforts in hohlraum drive and in indirectly driven implosions. To this day we benefit greatly from the participation of citizens from around the globe in our efforts, which have, as mentioned, indeed led to ignition. I am quite proud of my role in bringing about this broad participation.

With the NIF facility approved by the DoE, next came the long haul of actually constructing it. When first proposed as simply a “Nova upgrade” built within the existing Nova building, the price tag was a mere $400 million. However, as laser technology and architecture matured, a new building would be needed, and the task of erecting a structure while installing a high-tech laser would pose a great managerial and systems engineering challenge. Without steadfast support from many stakeholders, the NIF project would not have survived. Stakeholders included lab management, the other national labs, and the NNSA/DoE (particularly, within it, Sheldon Kahalas, Marshall Sluyter, Dave Crandall, Allan Hauer, and Chris Keane). At even higher levels, stakeholders included Congress and influential supporters of science at very high levels of government, such as Will Happer (from Princeton University and former head of the Office of Science at DoE), the late Arthur Kerman (from MIT), and Neal Lane (former provost at Rice University, former Presidential Science Advisor, and former head of the NSF). The NIF budget grew approximately fourfold from its initial $1 billion estimate, but it did reach completion 75 in the summer of 2009. Given the 1993 cancelation of the proposed particle accelerator, the Superconducting Super Collider in Texas (with $2 billion spent but perhaps $8 billion to go), it is remarkable that NIF survived as a project. 76 The decision showed a renewed national resolve to support a big science project.

By the 2000s, the NIF leadership had changed and was under, first, George Miller, and then Ed Moses when Miller became LLNL director. Much vision for the laser developments to follow was provided by Mary Spaeth and co-workers. 77 Ralph Patterson served as NIF Project Director. A great many technologies had to be developed for NIF to reach its specified 500 TW, 1.8 MJ goal. The French government also contributed to this effort, by way of a laser technology co-development agreement. The comparable-to-NIF LMJ laser in Bordeaux is still not fully complete, but with more beamlines added every few years, it is getting there. 78

Some of the breakthroughs (“the seven wonders of NIF”) needed for NIF to achieve its performance goals include continuous processing of the laser glass manufacturing; precision, programmable, and flexible pulse shaping using fiber optic oscillators and transport to the regenerative, stable, high gain pre-amplifiers; a four-pass, angularly multiplexed main amplifier, enabled by a large aperture optical switch (a large aperture plasma electrode Pockels cell); adaptive optics via the use of deformable mirrors; integrated computer control systems; and significant advances in target fabrication, especially in cryogenic systems, including the beta-layering technique for smooth, uniform, frozen DT shells, pioneered by the late Larry Foreman of LANL. Much of the target fabrication was done at General Atomics in San Diego, under the able leadership of Abbas Nikroo.

A final need, as mentioned earlier, was for large, rapidly grown KDP crystals for the conversion of the 1.06 μm light to 2ω and to 3ω. Given the end of the Cold War, the NIF project was lucky enough to recruit Natalia Zaitseva from Moscow State University, who had the technological know-how to grow these crystals 10–100 times more rapidly than traditional methods allowed. Without this contribution, we might still be waiting for those giant crystals to be grown.

During the decade that NIF was being constructed, the target physics program did not stand still. The point design 63 was highly instrumental in dictating the exact specifications for both the laser and the target fabrication. It called for a ∼20 ns long pulse, with a sequence of four shocks incident onto the CH ablator, which were meant to keep the frozen DT shell on a very low adiabat, allowing for very high compression. This design, with its relatively weak (∼1 Mbar) first shock, would later come to be known as the “Low Foot” design.

Experiments continued at the Omega laser at the URLLE. Some of that work proved out the strategy and method of beam phasing to control time dependent symmetry. 79 An important platform for NIF, the key-hole platform for measuring shock timing to minimize the adiabat of the implosion, was developed there as well. 80 As will be discussed below, an important platform to measure RTI growth was also developed. LPI was studied in a gas bag geometry that tested our LPI codes, such as the aforementioned pf3D. The compendium of all these efforts can be found in the previously cited publication of Lindl et al. 52

The validation of these plasma codes with Omega experiments that measured LPI thresholds and growth factors was quite important. It allowed the staff to decide which phase plates were to be ordered and built in time for the first NIF shots. The tension was as follows: a phase plate that made the laser beam spot sizes larger would lower the incident irradiance on target and stay below the thresholds for instabilities. However, large diameter spots would present difficulties in repointing those beams for possible symmetry adjustments, as the repointed, larger-spot laser beams might no longer fit properly inside the LEH of the hohlraum. As with so many issues in ICF, there was a trade-off that resulted in some middle-ground compromise. In this regard, it is possible that the initial results 78 from hohlraums illuminated by a partially built LMJ in France do show some LPI signatures. Those beams have a smaller diameter than those on NIF, and that may be the cause of the LPI seen there to date.

Two specific pieces of physics studied at Omega during the mid-2000s are also worthy of mention: cocktail hohlraums and the high flux model derived from gold sphere experiments.

A hohlraum can be made more efficient than a standard one with gold walls, by making the walls out of a combination of materials, a so-called “cocktail.” The idea is that any dip in the opacity of one material can be compensated by a peak in the opacity of another material at the same frequency. If that dip goes uncompensated, then photons can penetrate deep into the wall, never to emerge. If the opacity is compensated and is high, then the photon is absorbed near the surface of the wall. It can excite an ion and then have it de-excite and re-radiate into 4π. As such, the entire process can be viewed as a scattering event, and thus the cocktail wall has an effectively larger albedo (reflectivity) and will scatter energy back into the internal drive in the hohlraum, thus rendering the hohlraum more efficient.
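The efficiency argument above can be summarized with the standard, simplified hohlraum power balance (a textbook sketch, not a design calculation; the symbols below are the usual idealization):

```latex
% P_L: laser power; \eta_{CE}: x-ray conversion efficiency;
% T_r: radiation temperature; A_w, A_{LEH}, A_c: wall, LEH, capsule areas;
% \alpha_w, \alpha_c: wall and capsule albedos (the open LEH returns nothing).
\eta_{CE}\, P_L \;\simeq\; \sigma T_r^4
  \left[ (1-\alpha_w)\,A_w \;+\; A_{LEH} \;+\; (1-\alpha_c)\,A_c \right]
% A cocktail wall raises the effective albedo \alpha_w, shrinking the
% (1-\alpha_w) A_w wall-loss term, so the same laser power supports a
% higher T_r: a hotter, more efficient hohlraum.
```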

One way to test the principle is to measure the burn-through times of a cocktail sample on the side of a hohlraum vs that of an identical thickness sample of pure gold. The cocktail should have a delayed burn-through time because effectively it is scattering more of the drive back into the hohlraum, delaying the propagation of the Marshak wave through the sample. We did this in 1996. 81  

The acid test, of course, is to make a hohlraum out of the cocktail material entirely and then see if it gets hotter than a gold one illuminated by the same laser power. In the late 90s, I stepped down from being the ICF theory and design division leader. Nearly a full decade of management was about twice as long as I had originally anticipated, and thus was more than enough for me. Having seen NIF be justified for the SSMP mission, I wanted to understand better what exactly were the problems that NIF could help solve in the realm of HEDP of relevance to the SSMP. During a period of about 10 years, I helped solve two major issues, the so-called “energy balance” problem, and explicating the basics in the so-called “boost” process. In my absence from the ICF endeavor, efforts were made to do the acid test for cocktails, to no avail. The cocktail hohlraums did not get any hotter.

When I returned to doing ICF work, my first assignment was to figure out what went wrong with these disappointing drive experiments with full hohlraums with cocktail walls. I polled several project managers as to where to begin a calculational study, and each gave me pointers in different directions. Had I followed any of them, we would all still be wondering what went wrong. Luckily, an incident at LANL in which a floppy disk went missing for a while (later to be found behind a copier machine) brought about a shutdown of both labs, LANL and LLNL, for several weeks. The idea was to be introspective on how to improve security procedures. The computing facilities were shut down, so all I had with which to think about the problem was pencil and paper. I was able to formulate a hypothesis that the cocktails that had been used had oxidized, increasing the walls' specific heat, and thus lowering the temperature below what it would have been without the oxygen. I was able to estimate the effect by hand, and later confirm it when operations returned to normal, and full computing could resume.

The oxidation happened because of the way the cocktail hohlraum was made. A solid cylinder “mandrel” serves as a substrate upon which a hohlraum wall material is deposited. Later the substrate would be dissolved away, leaving an empty cylinder with the appropriate wall material. This etching process accelerated the oxidation into the cocktail, especially the material facing the inside of the hohlraum. I suggested an entirely different process to make the hohlraums. The substrate would be two solid pieces that each looked like a canoe (or a celery stick). The walls of the cylinder would be deposited on the inside of each canoe, and then the canoe mandrel would be dissolved from the outside, leaving the cocktail material that would face the inside of the hohlraum pristine. We tried this on Omega, and the now unoxidized cocktail behaved properly, and the hohlraum got hotter than a similarly driven gold walled one, precisely as predicted. 82,83

The offshoot of this research was simply to ask which cocktails were optimal. It turned out that a pure depleted uranium (DU) wall was better than gold, with ∼15% less wall loss (or, equivalently, a better albedo). When cocktail walled hohlraums for NIF proved too difficult to manufacture consistently, NIF simply used DU, 84 resulting in hotter, more efficient hohlraums. Since wall loss is about half the energy balance in laser driven hohlraums (the rest of the x-ray energy goes out the LEHs and into the capsule), the DU represented a 7% more efficient hohlraum. With 1.8 MJ incident, and say 1.5 MJ absorbed and converted to x rays, this means that DU “saved” about 100 kJ worth of incident laser energy. As we shall see later, to reach ignition we needed every bit of energy we could squeeze out of the NIF laser, so the 100 kJ saving turned out to be a major component in achieving ignition.
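The ∼100 kJ figure follows directly from the round numbers quoted above:

```latex
\underbrace{1.5\ \mathrm{MJ}}_{\text{x rays}}
\times
\underbrace{0.5}_{\substack{\text{wall share of}\\ \text{energy balance}}}
\times
\underbrace{0.15}_{\substack{\text{DU wall-loss}\\ \text{reduction}}}
\;\approx\; 0.11\ \mathrm{MJ} \;\approx\; 100\ \mathrm{kJ}
```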

A second area of physics tested on Omega was the non-LTE physics of the laser heating gold. In particular, x rays greater than 1.8 keV emitted from the gold could penetrate the capsule ablator and affect the density profile at the ablator–ice interface, leading to the RTI growing there without the benefit of ablative stabilization. Doping the capsule ablator with some higher Z material can control this density profile, but we needed to know ahead of time (to give the target fabricators time to figure out exactly how to do this doping) how much dopant was needed. Namely, we needed to know the relative size of the >1.8 keV photon emission to the thermal peak near 1 keV (given a 300 eV hohlraum, whose Planckian spectrum peaks near that energy).
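The “thermal peak near 1 keV” for a 300 eV hohlraum is just Wien's displacement law applied to the Planckian spectrum (per unit photon energy):

```latex
% Wien's law in frequency: h\nu_{peak} \approx 2.82\, k_B T_r
h\nu_{\mathrm{peak}} \;\approx\; 2.82\, k_B T_r
\;=\; 2.82 \times 0.30\ \mathrm{keV} \;\approx\; 0.85\ \mathrm{keV}
```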

To do so, Larry Suter suggested we shoot gold spheres on Omega and assigned me the job of modeling the emission we would measure from these gold spheres. To our surprise, the gold spheres emitted 85 thermal x rays at about twice the rate predicted by our then current non-LTE model. That model used XSN, 86 an average ion model that had no “delta-n = 0” transitions. It also used a restrictive flux limit. The local, Fick's-law-like conduction relies on a gradient of the temperature, but in ICF that gradient scale length can be so short as to lead to a nonphysically fast heat transport. As a result, the electron heat conduction in the computational model needed to be limited to a fraction, f, of the free streaming heat flux, namely, “fnvT,” with f as a variable, and n, v, and T as the electron density, thermal velocity, and temperature, respectively. Based on decades of previous work 87 (notably, with mostly unsmoothed beams), the value of f that fit most data to that date was rather restrictive, about 0.03.
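In symbols, the flux limiter works roughly as follows (a schematic form; exact coefficients and limiting conventions differ between codes):

```latex
% Local (Spitzer-like) conduction vs the free-streaming limit:
q_{\mathrm{local}} = -\kappa_e \nabla T_e , \qquad
q_{\mathrm{fs}} = f\, n_e\, v_e\, T_e , \quad v_e = \sqrt{T_e/m_e}
% The transported flux is capped:
% q = \min\!\left( |q_{\mathrm{local}}| ,\; q_{\mathrm{fs}} \right),
% so when \nabla T_e is steep, the cap with f ~ 0.03 (restrictive) or
% f ~ 0.15 (high flux model) governs the heat flow.
```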

To match the gold sphere's high x-ray emission levels, we needed to make two changes. Instead of XSN, we used a DCA (“Detailed Configuration Accounting”) model. 88 Even with that, we also needed to increase f to a value of about 0.15. We called this the high flux model 89 (HFM) because, compared to the previous non-LTE model, it predicted higher x-ray emissive flux and higher electron heat flux. As a result, a fluid element subject to these dual, higher loss channels would be significantly cooler than a similar fluid element under the older, more restrictive model, since the HFM would have it lose more of its heat. Many years later, we would find confirmatory evidence for this model by using Thomson scattering to directly measure the temperature of the plasma ablated from the gold sphere. That data showed 90 that it was indeed matched by the cooler predictions of the HFM, and it ruled out completely the restricted heat flux, f = 0.03, model.

Now the reader may well ask, what does all of this HFM work, on an open geometry gold sphere, have to do with what would happen in a NIF hohlraum? That is a fair question, but as we shall learn shortly, the initial results of the full NIF experiments showed, to our surprise, that it was the HFM that could explain the data there. A harbinger of this result was also available to us from work on Omega where LPI in hohlraums was studied. 91 The f = 0.03 model was working acceptably well in this experiment with a standard laser beam pointing. As reported by R. London at the APS/DPP 2008 meeting (Paper NO4.5), when the beams were separated and expanded to more uniformly cover the hohlraum wall, a model with f = 0.15 fit the data much better. Similarly, the gold sphere was rather uniformly illuminated, and f = 0.15 fit the data better. In a NIF hohlraum, the 192 beams cover much of the hohlraum walls, and perhaps that is why there, too, the f = 0.15 model seemed to fit the data better. Unfortunately, the point design used the old non-LTE model (XSN and f = 0.03), and that set the program up for the first of several surprises. Thus, the surprises that I had predicted would inevitably occur upon NIF startup indeed came to pass.

In addition to actual physics campaigns at Omega, there was another class of activity going on before the NIF was completed. This involved two flavors of “Red Teaming.” A Red Team is often employed against a “Blue Team” in war games. The Blue Team represents the strategies and underlying established assumptions, and the Red Team's job is to challenge those very assumptions.

The first exercise was, on one level, a Red Team challenging the assumptions of the Blue Team as to the physics models that underpinned the point design. On another level, it reflected a refreshing lack of arrogance: the NIF strategy was not to believe its physics assumptions per se, but to prepare an empirical tuning campaign to learn what the correct path to ignition would be. As such, this first exercise was to see if this tunability approach 92 would prove viable. Since this was pre-NIF, and the point design's success was only virtual, it was easy enough for a Red Team to change the physics assumptions that went into that design, so that the point design would fail under the Red Team physics. Examples of the changes included electron conduction assumptions, opacity and equation of state assumptions, target imperfections, etc. The Red Team did redesign the capsule to reach ignition under its physics assumptions. I formulated some of the Red Team physics, and then moved on to be the “referee” of the exercise. Larry Suter headed the Red Team, and John Edwards headed the Blue Team.

A virtual campaign ensued. 93 The Blue Team would propose experiments (e.g., shock timings controlled by the laser pulse shape, symmetry controlled by beam pointing and beam balance), and the Red Team would carry them out, all virtually, using its physics model. The Red Team would return synthetic “data,” and the Blue Team would then propose follow-on experiments to tune their way to ignition. This exercise did result in “ignition by tuning.” I found it interesting that the Blue Team, true to their physicist nature, hypothesized what the Red Team physics model was. They were quite wrong, but it did not prevent them from reaching (virtual) ignition. Another very valuable outcome of this exercise was the preparation of a great many computational tools to directly compare simulation predictions to actual data signatures.

The second form of “Red Teaming” was to perform a “pre-mortem,” which took place in 2007, 2 years before NIF began shooting. I chaired the multi-lab, multi-expertise Red Team, called the Ignition Risk Reduction Committee (IRRC), that was assigned to this task. It was composed of LPI specialists Hector Baldis and Bill Kruer of LLNL; laser specialist John Murray of LLNL; experimentalists Guy Dimonte of LANL and Dick Fortner and Mike Key of LLNL; nuclear physicists Ken Moody, Steve Libby, and Richard Boyd of LLNL; and designers of many fields of expertise including John Nuckolls, George Zimmerman, Charlie Verdon, John Lindl, Jim Hammer, and Omar Hurricane, all from LLNL, and Mark Herrmann, then of SNL. A pre-mortem means that we pretended that NIF had failed to reach ignition and then had to imagine or explain why. One issue raised was the kinetics aspect of LPI, and the fear that LPI would raise its ugly head. Another was our deep suspicion that the tent that holds the capsule in place at the center of the hohlraum would perturb the implosion. Our team looked at that issue but found that the computational power available at the time was insufficient to address it properly. Seven years later, with improved computational capabilities, the ICF Program would learn the hard truth about this tent issue, as will be reported below.

The revised Red Team continued to meet on occasion once NIF shooting began, in an advisory role to the Program, as an outlet for junior staff to present their non-mainline ideas, and to adjudicate disputes regarding diagnostic interpretations. I chaired this effort throughout most of that time. The revised LLNL team was comprised of John Nuckolls, John Lindl, Nino Landen, Bill Kruer, Erik Storm, Mary Spaeth, Bob Tipton, George Zimmerman, and Brian Pudliner. Participants from outside LLNL, such as Riccardo Betti from the URLLE, and Dov Shvartz from Ben Gurion University (BGU), attended on the occasions when they were visiting on-site.

Of course, a different form of “red teaming” is to have a robust independent effort from another lab, namely, LANL. This “collabo-tition,” a combination of collaboration and competition, had proved quite useful in the efforts to accomplish the Nova Technical Contract. For reasons unclear to me, the mid-2000s saw a decrease in funding and effort from LANL with regard to NIF and ignition. Nonetheless, some very important LANL work on diagnostics did continue. As we will see, once the NIF targets eventually produced substantial yields, the crucial LANL-supplied information on the shape of the neutron emitting regions was invaluable in understanding the behavior of those targets, and in reconstructing density maps of the imploded core. 94

In the summer of 2009, NIF was ready to fire all 192 of its beams into a target, albeit at reduced peak power, to slowly “break in” the new laser. The first targets were empty hohlraums. To everyone's surprise, the Dante broad band spectrometer, which looks through the LEH and reports the soft x-ray emission vs time, showed nearly twice the emission predicted by the “standard” non-LTE model, using XSN and f = 0.03. The HFM matched the data perfectly. 95,96 This seemed quite consistent with our experience with the URLLE gold spheres, which, as described above, also emitted nearly twice what was expected based on the “standard” model. It gave us a first inkling that the HFM could possibly be applied to NIF targets.

In December 2009, NIF fired its first hohlraums filled with gas and a capsule at the 1 MJ laser input level. Earlier such shots, in September, had been at lower input energies. These first began as “warm” targets, with a neopentane gas fill. Later these evolved into a helium/hydrogen mixture that needed to be cooled down to allow the proper fill density to be achieved without blowing out the windows that stretched across the laser entrance hole (LEH); the cooling lowered the pressure of that fill gas. More discussion of the possible complications of such cooling will be presented shortly.

Lo and behold, more surprises arose. The levels of LPI were rather high (about 15%), meaning that there was a decrement in drive due to those losses. Moreover, the spectrum of the stimulated Raman scattering (SRS) signal that went back into the lens was duly recorded. This spectrum can be interpreted to derive the temperature of the plasma from which it scattered, and it turned out to be quite different from the spectrum predicted by the standard non-LTE model. Once again, the HFM predictions matched that spectrum quite closely. Consistent with the levels of LPI and the observed spectrum was the notion that the plasma in the hohlraum (or at least the plasma in the part of the hohlraum from whence the SRS came) was significantly cooler than predicted by the standard model. As described above with regard to the plasma in the blowoff of the URLLE gold sphere, this cooler plasma is due to the HFM's enhanced radiative and conductive cooling. With these results, 97 it seemed clear that the HFM should be taken seriously in the design process. 98

An LPI phenomenon that can happen when high power laser beams (of slightly different wavelengths) cross within a flowing plasma medium is the growth of an ion acoustic wave that acts as a grating to transfer energy 99 from one beam to another (see Fig. 8). This cross beam energy transfer (CBET) had been seen in experiments in the 2000s, and now, with 96 beams crossing within each of the two LEHs, came to life in earnest at the NIF. The original strategy was to pick a “delta-lambda” (Δλ) between the inner and outer beam lines to minimize this phenomenon, as the point design did not include this complication; this required the facility to provide that option, which it did. However, the new results showed that the inner beams, which penetrated deep into the hohlraums, were the ones encountering the cool plasma and thus showed enhanced LPI (SRS) losses. Moreover, the cooler plasma there led to high beam absorption via the classical mechanism of inverse bremsstrahlung, directly impeding the propagation of the inner beams to their desired location deeper in the hohlraum. To restore symmetry under these adverse conditions, CBET was purposely used 100 (via a choice of Δλ) to transfer energy from the relatively unaffected outer beams to the inner beams. This method succeeded in those initial experiments.

Schematic of the phenomenon of cross beam energy transfer (CBET) as it manifests itself in the NIF hohlraum. Only the top half of the hohlraum is shown.

There were more surprises to come. In 2010, the facility installed more diagnostics and installed its full cryogenic capability. Recall that at Nova, no cryogenic capsules in hohlraums with shaped pulses had been tested, so the situation was ripe for surprises. It is true that LPI experiments with gas bags, cooled to low temperature to raise the density of the fill gas, had been carried out at Nova. 101 That research did evaluate ice forming on the gas bag, but since the bags were illuminated by a simple 1.5 ns square pulse, that thin ice layer was rather inconsequential.

Similarly, the first implosion results on NIF in 2009, as described above, employed “symmetry capsules” (“symcaps”) that did not have a frozen DT shell. As such, shock timing was less of an issue, so if ice formed on the LEH window, it would not necessarily be noticed. Nonetheless, curious results started to appear as a function of how long the targets sat in the chamber before being shot. In 2010, full NIF target shots with frozen DT shells and long shaped pulses, with the cryogenics in place, showed a curious drop in drive (again, as measured by Dante). Most importantly, the keyhole platform used to measure shock timing now showed some curious results rather directly. It was discovered that the vacuum in the NIF target chamber was insufficient to isolate the target: ice formed on the windows of the cold hohlraum. The detective work of Harry Robey and Cliff Thomas is noteworthy in this regard.

One early solution considered was to simply add extra laser energy onto the foot of the laser pulse to heat up and blow away the ice. Debbie Callahan asked Dan Clark to calculate the effects of this strategy. This was the first manifestation of what later would be called “the high foot” approach. Dan found that the higher foot would make the implosion far more hydrodynamically stable (see Fig. 9 ). We will return to this important point in Sec. VII . The actual solution selected for the ice problem was to install “storm windows” on the hohlraum, namely, double windows with an insulating vacuum gap. This allowed the point design research to proceed as originally planned.

(Left side) Three pulse shapes, with an increasingly higher power in the “foot” of the pulse. Each has a calculated adiabat “α” that rises with the rise in the foot. (Right side) The RTI growth factor (at peak implosion velocity) vs initial mode number for these three pulses, with a decrease in growth as the foot power level is enhanced. Calculations courtesy of Dan Clark of LLNL, and original art provided with his permission.

In 2011, then, the NIF was ready to field the point design. A series of keyhole platform shock timing measurements showed that, empirically, the timing could be tuned to the low adiabat goal. 102 The rho-R of the system was measured via the “DSR,” the down-scattering ratio of the ∼10 MeV neutrons to the 14 MeV DT fusion “birth energy” neutrons. The 14 MeV neutrons must traverse the dense DT shell (and, to a lesser degree, the remaining unablated CH ablator) on their way out to the detector. The denser that shell, the more rho-R they must traverse, and the more the neutrons are down-scattered in energy. The data did show an increase in DSR with improved shock timing.
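To leading order, the inference of areal density from the DSR is a simple linear rescaling; a minimal sketch of the idea (the coefficient of ~20 g/cm² per unit DSR is an illustrative round number assumed here for demonstration, not a calibrated NIF value):

```python
# Illustrative conversion from a measured down-scattering ratio (DSR) to an
# inferred fuel areal density (rho-R).  To leading order the relation is
# linear.  The coefficient below is an assumed round number for illustration
# only; the calibrated value depends on the detector energy window and the
# actual scattering cross sections.
def rho_r_from_dsr(dsr, coeff_g_per_cm2=20.0):
    """Estimate areal density (g/cm^2) from the DSR (dimensionless)."""
    return coeff_g_per_cm2 * dsr

# Under this assumption, a 4% DSR maps to ~0.8 g/cm^2 of areal density.
print(rho_r_from_dsr(0.04))  # -> 0.8
```

The point of the sketch is only that a denser, better-timed shell down-scatters a larger fraction of the 14 MeV neutrons, so improved shock timing shows up directly as a higher DSR.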

However, the DT implosion performance was disappointingly low. With ignition in the 10^18 neutron yield neighborhood, the yields in 2011 were in the 10^14 range. I co-organized, with John Lindl and Mike Key, a summer study of external experts to advise us on how to make further progress. They came from URLLE (Riccardo Betti, Ryan Nora, Valeri Goncharov), Washington University (Jonathan Katz), UCLA (Chan Joshi), France (Catherine Cherfils, Guy Shurtz), Israel (Dov Shvartz, Yoni Elbaz), Italy (Stefano Atzeni), the UK (Steve Rose, Peter Roberts), and LLNL (John Nuckolls, George Zimmerman, Paul Springer, Jim Hammer, Bill Kruer), and spent 2 weeks reviewing the data. Their number one piece of advice was “Push Longer.” Based mostly on experience 103 from direct drive experiments at the Omega laser at URLLE, the idea is to keep pushing on the implosion even after it is “committed” and its implosion trajectory would not really change much. The reasoning is as follows: even though the trajectory of the center of mass of the shell would not change, the shell is a hot plasma and can decompress and expand on its way inward. Keeping the drive on longer keeps the shell dense. The “ram pressure” that the shell can deliver to the hotspot gas scales as ρv^2, so the system can reach higher compressed pressures if ρ stays high. Follow-on experiments at NIF the following year proved their advice correct. 104
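The “push longer” argument above reduces to a one-line scaling; a toy sketch with made-up numbers (the velocity and densities below are illustrative assumptions, not shot data):

```python
# Toy illustration of the "push longer" argument: the ram pressure a shell
# can deliver to the hotspot scales as rho * v^2.  Once the drive turns off,
# the center-of-mass velocity v barely changes, but the hot shell decompresses
# during the "coast," so rho (and hence the deliverable pressure) falls.
def ram_pressure(rho, v):
    """Ram pressure ~ rho * v^2 (arbitrary consistent units)."""
    return rho * v**2

v = 3.7e7           # cm/s, an illustrative implosion velocity
rho_driven = 6.0    # g/cc, shell density with the drive kept on (illustrative)
rho_coasted = 3.0   # g/cc, shell decompressed after a long coast (illustrative)

# Same velocity, half the density -> half the stagnation ram pressure.
print(ram_pressure(rho_driven, v) / ram_pressure(rho_coasted, v))  # -> 2.0
```

The design consequence is exactly the one the summer study drew: minimize coast time so that ρ stays high when the shell stagnates against the hotspot.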

Nonetheless, the point design, low foot, CH target continued to disappoint. There was evidence of a serious mix of the ablator into the hotspot. 105 Such mix radiatively cools the plasma and lowers yield. Based on the measured surface finish of the capsule, the calculations could not reproduce this mix. We were now facing the “flip side” of deterrence by capability. Many were quick to jump to the conclusion that the codes were inaccurate, which brought into question our ability to certify our deterrent capability in the absence of nuclear testing. Those critics seemed blind, to me, to the possibility that it was not the codes that were wrong, but rather the assumed initial conditions that were off base. This mix could be “post-dicted” only if the surface finish of the capsule was artificially enhanced (over the metrologized surface roughness) by a factor of 4.

Symmetry continued to be controlled by CBET, but was reaching the point of diminishing returns at very high, and increasingly saturated, values of Δλ. The significant levels of measured LPI persisted, with no real confidence that even more internal LPI (such as side-scatter) was not also occurring. In addition, amid all of this “noise” of curious and disappointing results, the lesson to “push longer” and minimize coast time was at times ignored or forgotten. With the program, in general, floundering, it was time to invoke ICF's superpower: its inherent flexibility to adjust, to innovate, and to set out in new directions. We will discuss that in Sec. VII.

Before we leave the low foot point design, we should report, in hindsight, our views as to “what went wrong?” In truth, it took us into the 2014 timeframe to get some clarity on this issue. As mentioned earlier, the Red Team had pointed out its fear that the tent holding the capsule in the center of the hohlraum could be a source of perturbation to the implosion. In 2007, computer power was simply not capable of properly assessing that. Bruce Hammel kept pursuing the problem and, by 2014, aided by 7 years of “Moore's Law” improvements to computational capabilities, began to shed light on the effect. The initial estimates had treated the tent as simply an extra few dozen nanometers of material on the capsule. Bruce found that the issue was different. In his ab initio calculations, in which the 50 nm tent geometry was initialized with adequate resolution, the tent departed (or, perhaps better described, “lifted off”) from the surface of the capsule at some azimuthal position. When it was heated, it exploded inward and outward. The inward half impacted the ablator and, at the liftoff position, formed a shaped-charge jet. This collision of tent and ablator seeded a pernicious perturbation that led to hydrodynamic instability growth. Bruce was able to follow this perturbation growth all the way to a convergence of 40, down to the 25 μm radius scale. At this fuel assembly scale, using the Hydra code, he showed the tent penetrating into the hotspot. This could explain a great deal of how and why the low foot implosions mixed heavily and failed. 106

Another important part of the story of the tent problem was a new diagnostic that could image the capsule as it imploded, at least at a convergence of 5, at a 200 μm radius scale. As they have done so consistently throughout the history of ICF, new diagnostics opened our eyes to issues we could not see, or did not even imagine. The images, seen along the waist of the capsule, clearly showed the tent scar (at about plus and minus 45°) on the capsule. 107,108 I cannot help but wonder if we were fooled during the low foot campaign by this tent perturbation. If the tent scar closed off compressional heating above the 45° line (and below the minus 45° line), then a prolate shaped implosion of 3:1 aspect ratio could, upon assembly, look more like a 1:1 implosion. We might have been “tuning” symmetry in a completely wrong place in target performance space! Improvements to the tent problem involved having the tents be “polar,” namely, holding the capsule tangentially only at its north and south poles. 109

Yet another issue arose and reached some clarity by 2014. There was evidence 110,111 that the CH ablator could be photo-activated to uptake oxygen into its bulk. This uptake could be random, and the oxygen could serve as a perturbation since it is a source of opacity to the x-ray drive. A surface finish metrology would look smooth and not detect these hidden perturbations within the bulk of the ablator. Could we quantify this? Work at URLLE did show 112 this perturbation. 113 On NIF, a very important hydro growth radiography platform 114 was stood up using the same keyhole geometry employed in shock timing, but now with x-ray backlighting. 115 It measured the 3D modulations on a CH driven capsule and found something like a 4× larger growth perturbation than the one predicted assuming the measured surface finish. This eerily harkened back to our need for a 4× surface perturbation to explain mix due to hydrodynamic instability growth on the low foot shots.

Dan Clark performed some rather heroic 3D simulations, which I had termed “kitchen sink” calculations 116 —namely, apply any-and-all sources of perturbation to the low foot implosion calculation using 3D Hydra. The tent was certainly a major source of degradation. However, in truth, the low foot target was failing for a combination of “diseases.” Dan's 3D calculations explained the x-ray and neutron observations reasonably well. The point design may have been the most robust design in 1D with its high gain, but in the real world of 3D and in the real world of a variety of sources of degradation, it was not viable. It was time for a change of design that had 3D stability in mind. A general and extensive review of all the work at NIF to this point in time was published. 117  

Sections VII B and VII C will describe the many changes that were made in our approach to ignition that ultimately led to success. Let me now, first summarize the entirety of that path, quite briefly here, before we delve into the details. This will portray “the big picture” which shows that with each change (in some combination of hohlraum, capsule, and laser pulse) improvements ensued.

I ask the reader's indulgence if the thumbnail sketches here are too brief. All will be explained in detail as we proceed further into this paper. While Fig. 10 may also be considered a “summary” of all these changes, we defer it to later for the same reason: it is best comprehended when all the details are explained.

Target gain vs calendar year. The colors of the narrow data bars represent different target designs. Reproduced from H. Abu-Shawareb et al., Phys. Rev. Lett. 132(6), 065102 (2024). Copyright 2024, American Physical Society.

2009–2013: “Low Foot” four-shock pulse, CH ablator, high gas fill hohlraum: The “NIF Point Design.” Low adiabat, high gain potential, and “robust” in a 1D sense. Yields in the kilojoule range.

2013–2016: “High Foot” three-shock pulse, CH ablator, high gas fill hohlraum: Higher adiabat but less gain potential, and much more stable to hydrodynamic instabilities and thus more “robust” in a 3D sense. Yields in the 10 kJ range (eventually 28 kJ), but more importantly, much better “behaved.”

2014–2021: Low gas fill hohlraum with standard LEH size. Lowers the LPI seen in high gas fill hohlraums, but needs a shorter pulse (or other tricks) to control low mode asymmetry.

2016–2022: A shorter (still three-shock) laser pulse, paired with a high-density carbon (HDC, also known as diamond) ablator capsule, allows for better symmetry control. Yields in the 50 kJ range.

2018–2022: Increase capsule scale of HDC capsules. Hohlraum is more efficient in coupling to larger capsule, but more challenging for symmetry. Use “I-raum” or CBET to control symmetry. Achieves “burning plasma” (alpha heating exceeds PdV heating). Yields near 200 kJ.

2020: More precision on laser balance, hohlraum diagnostic windows, fill tube size, and HDC capsule quality. All necessary ingredients for making further progress.

2021: Smaller LEH leads to more efficient hohlraum. This allows a longer laser pulse (at less peak power) to keep “pushing longer” on implosion. Yield over 1 MJ and doubling of hotspot temperature due to fusion, so it exceeds the Lawson criterion and is scientifically “ignition.”

2022–2023: A 7% increase in NIF energy (past its original specs) allows a thicker capsule and an even longer pulse. Yields of 3–4 MJ, reaching the NAS definition of ignition.

So let us begin with the “multiple births” of the high foot. In 2012, there was a community wide brainstorming meeting organized by LLNL's Bill Goldstein (who later became LLNL Director) and Bob Rosner (of the University of Chicago, who is just finishing his term as APS President). This “San Ramon Workshop” 118 was attended by over 150 people, and parsed its sessions into: Laser propagation and x-ray generation, co-chaired by Chan Joshi of UCLA and myself; x-ray Transport and ablation physics, co-chaired by David Meyerhofer, then at URLLE (now at LANL) and Jim Hammer, LLNL; Implosion hydrodynamics, co-chaired by Valeri Goncharov of URLLE and Omar Hurricane, LLNL; Stagnation properties and burn, co-chaired by Riccardo Betti of URLLE and Johan Frenje of MIT; HED materials crosscut, co-chaired by Justin Wark of Oxford University, and Gilbert Collins (then at LLNL, now at URLLE); and Integrated modeling, co-chaired by Don Lamb of the University of Chicago and Marty Marinak, LLNL.

In the ablation physics section of that report, there was mention of a curious result. The DCA model, used on the ablator, predicted a strange “double peaked” structure of pressure vs radius in the ablator. This was a point of concern to which we will return shortly. In the implosion hydrodynamics section were figures that showed that a higher laser power in the picket (the first shock launcher) of the pulse, not yet called “high foot” though that was precisely what it was, led to significant reduction of growth of hydrodynamic instabilities. This was the work of Dan Clark, mentioned earlier, that he had done in response to the “ice on the windows of the hohlraum” issue of 2010.

Shortly after this workshop came a more formal introduction of the high foot concept. The just-mentioned problem of the curious double ablation structure early in the pulse in the CH ablator, as predicted by the DCA model, was investigated by Tom Dittrich and Jim Hammer. I feel partly responsible for this issue even arising. The HFM, which seemed to have proven itself on the NIF experiments to date (and had first arisen in analyzing high Z sphere emission in Omega experiments), called for the use of DCA. However, strictly speaking, its use was proven and recommended solely for high power illumination of high Z elements (and a T of several kiloelectron volts) and not for low power illumination of low Z elements (and a T of 60 eV!). Thus, the use of DCA for the CH ablator early in the pulse was a bit of overzealous “mission creep” by the program. In any event, the team did what any good designer does: redesign to avoid the problematic and curious result. They proposed a “higher foot” for which even the DCA model predicted a single hump of pressure vs radius, not a double one. This is all presented in the first part of their publication. 119 For the record, about a year after this work, the DCA model was upgraded in this low temperature, problematic part of parameter space, and the double ablation structure disappeared. Luckily for the ignition program, the “design fix” of the high foot was already well on its way to implementation.

In the second part of that same PRL came a crucial result. The high foot led to 2D capsule implosion simulations that survived intact the hydrodynamic instability growth, seeded even by the (thought at the time to be artificially enhanced) “4×” roughness. It was this same “4×” enhanced roughness described earlier, that was leading, calculationally, to the failure of the low foot design. This three-shock, higher adiabat design was a compelling possibility for improving performance, by optimizing on 3D, real-world stability, not 1D idealized performance. Previous attempts at predicting perturbation growth relied on adding perturbation amplitudes of various modes in quadrature. It seems like Mother Nature was being less kind, in that 3D perturbations coupled more perniciously.

Some general comments, and then some particular comments, on the improvement in stability for this design are in order. First, the general lessons. A three-shock system will not be as “true” as a four-shock system in keeping the system close to the Fermi-degenerate adiabat. As such, α will be higher in the three-shock system, rising from the low foot's presumed α of 1.5 to an α closer to 3. Given that P ∼ αρ^{5/3}, at the same peak driving pressure P, the higher adiabat system, namely, higher α, will have a lower shell density ρ. The ablation velocity, V_A, is given by (dm/dt)/ρ, where dm/dt is the mass ablation rate, which depends on the drive temperature T. Thus, we expect the ablation velocity V_A to scale as α^{3/5}. The reader is referred to my ICF tutorial 16 to follow this argument through, which leads to a smaller in-flight aspect ratio, namely, a thicker shell upon implosion, at the higher adiabat α. A thicker shell will be somewhat more impervious to the damage brought on by the RTI. Moreover, as ablative stabilization depends on V_A, this again points to the stability advantage of higher α (at a cost in the higher gain to be had at lower α). These lessons were all independently learned at the Omega facility at the URLLE, with direct drive. 22 As will be described shortly, experimental evidence at NIF supports these arguments. 120
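The α scaling just described can be written out as a short chain of proportionalities (a sketch, at fixed peak drive pressure P and fixed drive temperature T, so that the mass ablation rate ṁ is held fixed):

```latex
P \sim \alpha\,\rho^{5/3}
\;\Longrightarrow\;
\rho \sim \left(\frac{P}{\alpha}\right)^{3/5},
\qquad
V_A \;=\; \frac{\dot m}{\rho}
\;\sim\; \dot m \left(\frac{\alpha}{P}\right)^{3/5}
\;\propto\; \alpha^{3/5}.
```

So, for example, doubling α raises V_A by 2^{3/5} ≈ 1.5, strengthening ablative stabilization while also thickening the in-flight shell, at the cost of the lower compression (and gain) that comes with the lower ρ.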

Further detailed system studies of the stability of the high foot system uncovered some interesting and useful lessons. Before the shell smoothly accelerates inward, and thus becomes subject to the RTI, the shell is first driven by a shock. This shock is in itself hydrodynamically unstable, as it is subject to the Richtmyer–Meshkov instability (RMI), which can amplify any initial non-uniformities. The phase of the perturbation may be controlled in such a way as to minimize the size of the perturbation at the time when the RTI kicks in, and in this way minimize the growth of the initial non-uniformities. 121 This same principle is at work in more recent work, the “SQn” approach, 122 which has a smoother acceleration to minimize the RM seed for the RTI. A more complete discussion of these issues can be found in the review paper by Meezan et al. 123 Omar Hurricane has related to me that, in the early days of the high foot design, he advocated for dropping four-shock systems to either three or two shocks, specifically to minimize the RMI. Tom Dittrich still advocated for a four-shock system, but with the high foot as the first shock, that four-shock system in Denise Hinkel's hohlraum design was so close to a three-shock system that three shocks were adopted.

Not only was this high foot scheme attractive in reducing growth rates for the seeds of that “4×” roughness, which probably contributed to the low foot's underperformance, it also mitigated the pernicious effect that the tent had on the low foot design. The visible “tent scar” seen, in the low foot, in the back-lit image of the capsule implosion, taken as it imploded from a radius of 1 mm to a radius of 200 μm, completely disappeared when the high foot implosion was performed.

From these promising indicators came gratifying results. Yields jumped tenfold, and a 10 kJ yield was comparable to the energy in the final fuel assembly. 124 Far more important, to my mind, was the fact that the implosions were “behaving” far more “rationally” than the low foot. For example, when implosion velocities were increased, leading to higher measured hotspot temperatures, as measured by the neutron time of flight (NTOF) detectors, the yields rose accordingly. In fact, they rose as T^4.1, just as we would expect from the sigma-v scaling of the DT fusion reaction rate in the 3–4 keV range of the measured T. This is in sharp contrast to the low foot yields rising as T^2.4 when we would have expected a T^6 scaling in the 1.5–3 keV range of the low foot implosion hot spots. As explained in Sec. VII A, mix from various sources was killing the yield of the low foot, so no “rational” scaling would ever emerge from those experiments.
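Exponents like the T^4.1 quoted above come from a power-law fit, i.e., a straight-line fit on log–log axes; a minimal sketch with synthetic data (the temperatures and yields below are made-up numbers constructed to follow T^4 exactly, purely to demonstrate the procedure, and are not NIF measurements):

```python
import numpy as np

# Fit Y ~ A * T^p by linear regression in log-log space: the slope of
# log(Y) vs log(T) is the power-law exponent p.
def fit_power_law(T, Y):
    p, log_a = np.polyfit(np.log(T), np.log(Y), 1)
    return p, np.exp(log_a)

T = np.array([3.0, 3.3, 3.6, 4.0])   # keV, synthetic temperatures
Y = 1.0e13 * (T / 3.0) ** 4.0        # synthetic yields obeying T^4 exactly

p, a = fit_power_law(T, Y)
print(round(p, 2))  # -> 4.0
```

Applied to real shot data, a fitted exponent near the sigma-v expectation (as with the high foot's T^4.1) signals a “rationally behaving” implosion, while a much shallower exponent (as with the low foot's T^2.4 against an expected T^6) signals that some other degradation, such as mix, is capping the yield.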

This “good behavior” of the high foot campaign also allowed systematic studies of other issues. The “push longer” advice of the 2011 Summer Study group could now be tested more systematically on this rational platform. Omar Hurricane and co-workers published a careful study of how important it was to “push longer,” or, in their terminology, lower the “coast time” of the implosion. 125 This important lesson would continue to inform the program in its progress toward ignition. Another way to state the requirement of low coast time is to minimize the radius at which the imploding shell reaches peak velocity. 4,126

In the mid-2010s, the laser program leadership was transferred over to Dr. Jeff Wisoff, who holds that position to this day. The NIF facility was eventually under the guidance of Mark Herrmann and then Doug Larsen, and most recently led by Gordon Brunton. The every-day operation of the facility has been managed ever so faithfully by Bruno Van Wonterghem, to this day. On the program side, the ICF program leader became John Edwards, to be followed a half decade later by Mark Herrmann, and more recently by Richard Town. I am proud of the fact that in the 90s I hired both John Edwards from AWE, and Mark Herrmann from PPPL, knowing full well their leadership potential. Richard Town was hired in the 2000s, after my tenure as X-Division leader was over, by my very worthy successor, Charles Verdon. Both Verdon and Town came over to LLNL from the URLLE.

In this timeframe, some serendipity paid the program a visit. The neutron diagnostics needed calibration from a source that would emit neutrons into 4-pi rather uniformly. A thin shell of CH surrounding a thin shell of DT was imploded with a short 4 ns long pulse. Because the pulse was so short (vs the 22 ns long pulse of the low foot design and the 16 ns long pulse of the high foot design), it was decided to shoot it in a hohlraum that was near vacuum. The hohlraum and capsule performed flawlessly, in accord 127 with the predictions of the HFM. While targets like these are often termed “indirect drive exploding pushers,” Ref. 115 makes it clear that they are not exploding pushers in the true sense of the word, as has been described earlier regarding the first campaign at Shiva. They are a radiation driven thin ablator system that implodes quite rapidly, sends a strong shock ahead of it into the fuel, and gets that fuel hot.

Perhaps most significantly, the LPI levels that had persisted even with the high foot experiments (which had a hohlraum He gas fill of 1.6 mg/cc) disappeared in this new near vacuum platform. For a change, Mother Nature was acting kindly toward the LLNL indirect-drive ICF Program. The coupling of the laser to the hohlraum was in excess of 99%. A lasting lesson from this would be: Be light on your feet, be prepared to take advantage of lucky breaks, and then be brave enough to actually change course. When this hohlraum was later utilized to implode high convergence capsules (with a different ablator material, as will be described shortly), all its advantages persisted nicely. 128

Another lesson eventually emerged from this experience with near vacuum hohlraums. The capsule symmetry exhibited a surprising behavior. The capsule emission was prolate, implying good propagation of the inner beams to the waist of the hohlraum. The simulations were predicting an oblate implosion, implying that the inner beams were undergoing difficulty propagating to the waist. The designers had to artificially change the wavelength of the laser to get the inner beams into the waist area. 129 This, of course, did nothing for our code credibility. It was hypothesized that the near vacuum hohlraum allowed interpenetration of the plasma flowing from the ablator and from the gold bubble (caused by the outer beams on the walls of the hohlraum), but the code capability was not quite up to computing that reliably. Experiments were done 130 at URLLE to test interpenetration in a cylindrical geometry.

It was not until years later that we ultimately understood the source of this discrepancy. George Zimmerman had put a better multi-fluid penetration package into Lasnex, and Drew Higginson of LLNL was assigned to test it out against these confounding near vacuum hohlraum symmetry results. That package alone did not explain the data. In the interim, Steve Maclaren of LLNL had zoned up a good portion of the double "storm window" hardware outside the LEH to be included in the code. In addition, there was the in-line CBET package. It turned out that, most crucially, it was CBET occurring in this outside-the-LEH plasma. The CBET enhanced the inner beam strength and resulted in the correct symmetry. 131 The lesson here is that details really matter: putting in the correct amount of detail to properly simulate the reality of the experiment (in this case, the storm window and its ensuing plasma formation) is crucial in explaining data and thus in projecting more accurately the plasma conditions and behavior of future targets.

The serendipitous result of the near vacuum hohlraum's elimination of LPI immediately suggested a systematic study of LPI levels vs the amount of gas fill in the hohlraum. 132 That study showed that the SRS backscatter came close to zero for He fills of 0.6 mg/cc and below. The result was explained by plasma physics post-processors, that “post-dicted” very low LPI given the low density and short gradient scale-lengths.

Shortly thereafter, I co-organized, with John Edwards, the next Summer Study session, in 2014. There were participants from AWE (Peter Graham), Ben Gurion University (Dov Shvarts), LANL (Don Haynes, Ray Leeper, Steve Batha), NNSA (Kirk Levedahl, Jeff Quintenz), NRL (Andrew Schmitt), SNL (Mark Herrmann, Mike Campbell), SLAC (Siegfried Glenzer), URLLE (Riccardo Betti, Valeri Goncharov, David Meyerhofer, Craig Sangster), and LLNL (Jim Hammer, George Zimmerman, Paul Springer, Steve MacLaren). The group's final report was unequivocal: Shift all hohlraum work to low gas fills; eliminating the hard-to-calculate and hard-to-diagnose LPI effects was an imperative.

A low density hohlraum gas fill would open up a challenge to achieving good low mode symmetry, as now the gold walls would ingress more than with a high density gas fill, challenging beam propagation and shifting the places where the lasers converted their energy to x rays. This would be very difficult for the long pulse implosions that were needed for CH ablators. Denise Hinkel and co-workers did succeed in redesigning CH ablator high foot implosions with a somewhat shorter pulse that could, in principle, be symmetrized. 133 However, ultimately, the tent's perturbative effect on that CH capsule would probably remain an issue.

So, along came another principal lesson from this long saga: Diversify. While the CH ablator, high-foot work was being highlighted by the ICF program, an alternative technology was slowly making progress, doing fundamental and foundational work that benefited from not being in the limelight. Years earlier, "seed money," through the vehicle of Laboratory Directed Research and Development (LDRD), had been devoted to exploring an alternative to CH: ablators made from high-density carbon (HDC). By the way, the entire LDRD process and infrastructure was instituted at LLNL by the initiative of Claire Max, who no longer worked in the ICF program but had moved on to do many other things at LLNL. Claire's other activities included starting the LLNL branch of the Institute for Geophysics and Planetary Physics (IGPP) and initiating the laser guide star effort, which used a laser to light up sodium atoms in a layer of the atmosphere so that specially augmented telescopes could dynamically correct for atmospheric fluctuations and thus sharpen their eye on the universe. Claire is now at U.C. Santa Cruz.

The HDC ablator benefited from needing a much shorter pulse to drive it; hence, it was very well matched to the ICF Program's move to low fill hohlraums. Because HDC's density is about 3.5× that of CH, an ablator of the same mass is about 3.5× thinner, so the first shock has 3.5× less thickness to traverse before it breaks out at the ablator/DT-ice interface. This makes the three-shock pulse for HDC much shorter than the equivalent one for the CH ablator approach. Being out of the limelight allowed the HDC team to perform careful experiments 134,135 throughout the duration of the pulse, ensuring good low mode, P2, symmetry throughout, and avoiding any fuel "sloshing" and other symmetry swings in time that could compromise target performance. 136

Another lesson, seen throughout this saga, is, again, the role of diagnostics. A wide array of diagnostics was available, each dedicated to a portion of the time history of the implosion, to diagnose the symmetry and allow us to retune to improve it. This too was a long time in development. I recall, during the early 90s while executing the NTC, that we presented many of these techniques (and not just in theory, but already tested on Nova) to measure and to ensure time dependent symmetry. 137 Opponents of the indirect drive/NIF project claimed that time dependent symmetry would be too difficult to measure and achieve, but we had already considered the problem and had already prepared ways to address it. All these techniques came to the fore when demonstrating time dependent symmetry for the HDC campaign.

Another advantage of the shorter pulse for HDC was that the shorter it was, the easier it would be to lengthen it slightly in order to "push longer" on the capsule and reduce the coast time for improved performance. The HDC showed better stability to perturbations from the tent but did show sensitivity to perturbations from the fill tube. The fill tube's role is to inject the DT gas into the center of the capsule in the first place, before the DT is frozen in place to form an ice shell.

The HDC effort first operated at "sub scale," for instance, at a radius of 0.9 mm rather than 1 mm. This allowed for more shots, without too much worry of NIF laser damage, since it required less incident energy. Before too long, the HDC scale-0.9 capsules were yielding over 50 kJ, 138 which was very promising indeed! HDC (also known as diamond) has a crystalline structure, which can be a seed for RTI. Therefore, the first shock must exceed 12 Mbar in order to melt those structures. This naturally limits the HDC approach by forcing a higher adiabat, α. The way to increase yields, then, was to increase the scale of the target. We will discuss that in Sec. VIII .

Before we begin to describe that excursion that finally brought ignition to fruition, namely, going to larger scale capsules, this would be a good point to look back at all the effort described up to this point and remark on the state of understanding of that progress vis-a-vis achieving ignition. Many of the target designs achieved temperatures of about 5 keV and hotspot ρR products of greater than 0.3 g/cm². In short, by the conventional Lawson criterion, they were ripe for ignition, but had certainly fallen short in practice. So what was going wrong? I believe that the answer lies in the 3D world in which we live, and not in the 1D criteria that constitute the Lawson criterion.

The published works of Springer et al. 139 and Patel et al. 140 and co-workers emphasize that if there are 3D perturbations to the implosion, they lead to 3D thin spots in the confining shell. While this was appreciated in general, 141 their work actually calculated the failure due to these 3D effects. When those thin spots expand and balloon, their PdV cooling is enhanced over any 1D average expansion. This extra cooling kills the ignition of the capsule. At minimum radius, the system has d²T/dt² < 0, leading to a fizzle. A thermal instability, namely, a thermal runaway, which we call ignition, needs d²T/dt² > 0. This criterion for ignition was developed for general systems by my late colleague Abraham Szoke and myself, and has been successfully applied by Springer (see Ref. 13 of Springer) and Patel to the NIF targets. Their work showed that indeed, these considerations could explain the failure to ignite, despite the 1D and 2D simulations that predicted success. They concluded that to achieve ignition, one needs either better symmetry, to cut down on 3D thin spots, or targets with larger ρR, for better confinement and thus more robustness. A larger ρR could be achieved with a larger scale capsule.
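The sign condition on d²T/dt² can be made concrete with a deliberately minimal, zero-dimensional toy model of a hotspot energy balance (my own illustrative sketch in normalized units with made-up coefficients, not the actual Springer/Patel analysis): alpha heating scaling as T⁴ (the reactivity scaling near 4–5 keV) competes with a bremsstrahlung-like T^(1/2) loss. Just above the unstable equilibrium the net heating grows as T grows, so d²T/dt² > 0 and the temperature runs away; just below it, the hotspot fizzles.

```python
# Toy 0-D hotspot energy balance illustrating the d^2T/dt^2 > 0 runaway
# criterion.  Normalized units; coefficients are illustrative only.

def net_heating(T):
    """dT/dt: alpha heating (~T^4 near 4-5 keV) minus T^0.5 losses."""
    return T**4 - T**0.5

def evolve(T0, dt=1e-3, steps=5000):
    """Forward-Euler integration; returns (final T, ignited?)."""
    T = T0
    for _ in range(steps):
        T += dt * net_heating(T)
        if T > 10.0:      # thermal runaway: call it ignited and stop
            return T, True
        if T < 1e-3:      # quenched
            return T, False
    return T, False

# The unstable equilibrium of T^4 = T^0.5 sits at T* = 1 here.
# Note d^2T/dt^2 = f'(T)*f(T): both factors positive just above T*.
T_hot, ignited_hot = evolve(1.05)    # just above threshold -> runaway
T_cold, ignited_cold = evolve(0.95)  # just below threshold -> fizzle
print(ignited_hot, ignited_cold)     # -> True False
```

The same structure (a cooling mechanism that wins below a threshold, a heating mechanism that wins above it) is why enhanced PdV cooling from 3D thin spots can tip an otherwise marginal capsule from runaway to fizzle.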

These same conclusions were reached by Dan Clark using 3D HYDRA "kitchen sink" calculations. These simulations were performed on significant capsule implosions that included the highest yield at the time, performance cliffs, and experiments that assessed repeatability and hydrodynamic scaling. They captured global trends in the NIF implosion data for the neutron yield, neutron down-scatter ratio (DSR), burn weighted ion temperature, and burn width. These gave better agreement than 2D HYDRA simulations and appear in Fig. 14 of the review paper 142 by Marinak et al. The close level of agreement for this set of highly significant implosions gave us confidence that these simulations were capturing the important implosion physics, including the burn. These simulations showed that various asymmetry sources were acting in concert to degrade the capsule yields and prevent ignition. They indicated that even if we fixed all asymmetry sources to within the abilities of target fabrication and the laser, the capsule would still not ignite. These simulations made it clear that it was imperative that we develop more robust designs, in particular larger scale capsules.

It was roughly in this time frame that colleagues at LANL released a 2019 report 143 that predicted "with high confidence" that NIF would never achieve ignition, and that a laser about 10× bigger was required. Despite what was, to my mind, no compelling physics reasoning behind this conclusion, this report seemed to carry weight and influence with various review committees. I think it is to the ICF Program's credit that it persevered despite these negative reports (much as it did in the dark days of the low-foot campaign) and calmly pushed forward. While target quality and the afore-mentioned 3D non-uniformities were getting in the way of ignition, and leading to the pessimism of the LANL report, the fact that the requisite T and ρR were being achieved really meant (at the very least, in hindsight) that the program was actually "tantalizingly close" to making rapid progress. (Culturally, it was great taboo at that time to use that phrase, in-house, as if it were a "jinx" to progress.) Another way of saying this is that the metric of ignition, known as "ITFX" 117,144 (which stands for "Ignition Threshold Factor, measured eXperimentally"), was getting quite close to unity.

Increasing the scale of the capsule is a high leverage way to increase yield. Yield should scale as a fusion rate per unit mass, ρ⟨σv⟩, multiplied by a confinement time, t, and then multiplied by the mass, ρR³. Near the hotspot temperature of 4–5 keV, ⟨σv⟩ ∼ T⁴, and since ρT is the pressure, P, we get a yield scaling as P²T²tR³. The confinement time scales as R/v. A hydro-equivalent 145 implosion preserves P and v, so yield scales as T²R⁴. The arguments for how a larger scale will increase T because of reduced conduction losses result 146 in a T ∼ R^(2/7) scaling. Thus, we end up with a yield scaling as R^4.6, or, with S representing scale, S^4.6. All this is a yield that is unenhanced by the resultant alpha heating, which will increase the yields even further.
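Written out as one chain, the scaling argument of the preceding paragraph is:

```latex
Y \;\propto\; \rho\langle\sigma v\rangle \,\cdot\, t \,\cdot\, \rho R^{3}
  \;\sim\; (\rho T)^{2}\,T^{2}\,t\,R^{3} \;=\; P^{2}T^{2}tR^{3}
  \qquad \bigl(\langle\sigma v\rangle \sim T^{4},\;\; t \sim R/v\bigr), \\
\text{fixed } P,\,v:\qquad
Y \;\propto\; T^{2}R^{4}
  \;\xrightarrow{\;T \,\sim\, R^{2/7}\;}\;
  Y \;\propto\; R^{4/7}\,R^{4} \;=\; R^{32/7} \;\approx\; R^{4.6}\;=\;S^{4.6}.
```

Note that the exponent 4.6 is just 4 + 4/7 = 32/7 ≈ 4.57, rounded up.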

This strong scaling of yield with scale suggested that the program embark on a campaign of "High Yield Big Radius Implosion Design" (HYBRID). 147 Of course, the capsule is sitting in a hohlraum and must be imploded symmetrically, so that there is minimal residual kinetic energy upon stagnation and all the kinetic energy of the imploding shell can be transferred into internal, thermal energy of the assembly. The challenge, then, is to put a larger capsule into, roughly, the same size hohlraum as before, since a larger hohlraum would simply soak more energy into its larger area walls and inefficiently transfer energy to the capsule. A larger capsule in nearly the same size hohlraum (a smaller "case to capsule ratio") presents a challenge to providing the needed implosion symmetry (see Fig. 11 ).

Hohlraum and capsule size comparison as we proceeded toward ignition scale targets. Reproduced from M. Marinak et al., Phys. Plasmas 31, 070501 (2024) with the permission of AIP Publishing.

The fact that the program was committed to a low density gas fill made this symmetry problem even more acute. The larger capsule can get in the way of the inner side of the inner beams trying to propagate to the waist of the hohlraum. They already have that challenge, as the outer side of these inner beams (the side farther away from the hohlraum axis) tries to traverse the "gold bubble" coming from the inward expansion of the gold walls illuminated by the outer beams. Debbie Callahan and co-workers published a compendium 148 of NIF data under these conditions, which supported the notion that a larger radius capsule and the longer laser pulse (that must come along with a larger capsule) both exacerbate the asymmetry and drive the capsule toward the undesirable oblate shape.

There needed to be a way to break out of these constraints if the HYBRID campaign were to succeed. One method to do so involved a return to invoking CBET and choosing a Δλ to help bolster the inner beam strength by "borrowing" energy from the outer beams. This method was first proven, under these newer low density gas fill conditions, in the HYBRID C campaign that used CH ablator capsules. 149 It was then adapted by the HYBRID B and the HYBRID E campaigns that used HDC ablators. 150 Another method was to change the hohlraum shape (yes, again invoking ICF's superpower of adaptability). The "I-raum" 151 looked somewhat like a capital letter "I" (in Times New Roman font: I). In the places where the outer beams hit, the hohlraum had a cylindrical radius larger than normal. This meant that the gold bubble had farther to ingress before it interfered with the inner beams, thus allowing the inner beams, over the longer pulse, to pass the gold bubble's axial position somewhat less impeded. Of course, the I-raum could also use Δλ if it needed to.

Callahan and co-workers followed up on their "symmetry rules" paper with another paper 152 that considered some global rules for hohlraum drive. I was proud to help with the research on this aspect. Combining this work with the previous one on symmetry allowed the program to produce a global map of operating space. Plotting case-to-capsule ratio on the y-axis and hohlraum diameter on the x-axis mapped out a narrow band (due to symmetry constraints) showing where to optimize absorbed capsule energy (at fixed laser energy). Relieving the symmetry constraint by invoking CBET widened the acceptable operating space and thus increased the possible absorbed capsule energy. The choice of how to narrow down this available space even further was made by the notion, mentioned several times above, of the importance of "pushing longer" to minimize coast time. As discussed above, this physics was roughly equivalent to finding the minimum radius at which to achieve peak velocity. When this metric was applied and overlaid on the previous constraints, it became clear what capsule size and hohlraum size to use, and what energy absorbed by the capsule to expect.

In the 2019–2020 time frame, the initial attempts at scaling up did not go smoothly. The HDC capsules were scaled up from a radius of 0.9 mm to a radius of 1.1 mm. The principal impediment to progress was target quality. There were too many voids, inclusions, and pits (VIPs) that served as initial perturbations for the RTI, and target performance suffered. A new batch of capsules of radius 1.05 mm was tried next. Their quality was improved over the previous batch. Moreover, the somewhat smaller radius could allow for a similar laser pulse to provide a “push longer” environment, reducing coast time and improving performance. Yields of order 170 kJ were achieved, which was the same order as the amount of energy absorbed by the capsules. These milestones were achieved in both platforms, the HYBRID-E cylinder with CBET, and in the I-raum. Careful analysis suggested that we had reached a so-called “burning plasma” in which alpha heating was the dominant contributor to the yield performance. 141,153,154 In retrospect, it should have been obvious that we were close to ignition. However, we had already been on this long path to ignition for over a decade on NIF, and psychologically most (not all!) of us were not prepared for exactly where, when, and how the next step in progress would be made.

A campaign led by Joe Ralph was initiated to develop more efficient hohlraums. A principal tool to do so would be to return to a smaller LEH (used in much earlier NIF experiments), with the laser pointing adjusted accordingly. A smaller LEH would be more efficient, as less energy would be lost out of the LEH, allowing more to be absorbed by the capsule. This more efficient hohlraum would also, critically, allow for a longer pulse (at fixed energy, by having a lower peak power). The lower peak power, in the more efficient hohlraum, would still provide the necessary drive to accelerate the capsule to nearly the same implosion velocity. More importantly, it would allow us to "push longer" on the capsule and minimize coast time. Furthermore, the target fabrication effort provided us with a capsule with a mere 2-μm-diameter fill tube, to minimize the effect of higher Z material jetting into the hotspot from that tube. In addition, we were provided with an excellent quality target with regard to VIPs.

This all came together in shot N210808 on August 8, 2021. An order of magnitude increase in yield, 1.35 MJ, was produced. The temperature doubled from the 5 keV achieved by the PdV heating of the implosion to 10 keV achieved by the fusion process itself. This was ignition in the scientific sense, and the Lawson criterion for ignition was exceeded. 155 The simulations 156 showed that d²T/dt² was positive at minimum radius, so finally the Rosen–Szoke criterion was achieved as well. A capsule gain of order 6 was achieved, and a target gain of 0.7. We were now certainly close to "achieving ignition" by the NAS metric of unity target gain or greater. There was some discussion within the program as to whether to declare that ignition had been achieved. Our director, Kim Budil (wisely, in my opinion), insisted that we stick to the NAS criterion. Since we, in the long run, would need to increase target performance anyway, it should (and would) be only a matter of time before we would achieve a gain greater than unity.
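As a rough consistency check on the two gains quoted (taking the roughly 1.9 MJ of laser energy delivered on N210808, and the roughly 0.23 MJ of capsule absorbed energy that is implied by the stated capsule gain):

```latex
G_{\text{target}} \;=\; \frac{Y}{E_{\text{laser}}}
  \;\approx\; \frac{1.35\ \text{MJ}}{1.9\ \text{MJ}} \;\approx\; 0.7,
\qquad
G_{\text{capsule}} \;=\; \frac{Y}{E_{\text{capsule}}}
  \;\approx\; \frac{1.35\ \text{MJ}}{0.23\ \text{MJ}} \;\approx\; 6 .
```

The gap between the two gains reflects the hohlraum's inherent inefficiency: only a small fraction of the laser energy is ultimately absorbed by the capsule.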

Attempts at repeating this result fell short. 157 Those attempts were stymied by either mix stemming from target VIPs or mode-1 asymmetries 158 in either capsule shell thicknesses 159 or laser delivery. 160 Near ignition, the consequence of every imperfection is amplified. 161 On average, the yields for the repeats centered at about half the 1.35 MJ yield of N210808. In retrospect, this result justifies the pre-shot diagnostic settings for N210808, which anticipated that lower level of yield. It was clear from these repeats that, just as at Nova fulfilling the NTC required the special efforts of a "Precision Nova," we now needed special efforts at a "Precision NIF."

So, the next element of the program, the NIF laser effort, stepped up to the plate. Not only did they supply more precision in the laser, they were also able to exceed the original NIF energy spec and delivered 162 a 2.05 MJ, 440 TW pulse. The target was adjusted to have a thicker ablator, 2,4 which was better matched to this somewhat longer pulse. This extra thickness would increase the confinement parameter, ρR, and may have also mitigated the degradations caused by whatever mix was happening at the ablator–ice interface, by further insulating the hotspot from that interface.

The first shot did not quite achieve the symmetry needed, but adjustments were made for the second shot. I had full confidence in our design team that they could make this symmetry adjustment successfully. As such, I planned to follow the progress of the shot in real time, but it was delayed into the early morning hours. I made the decision to go to sleep. That was a poor decision, as I was too excited to sleep! I woke up early the next morning to learn the great news: On December 5, 2022, at 1:30 am, shot N221204 (whose countdown sequence began the previous day, on December 4), a lovely round implosion, produced 3.15 MJ, and ignition was achieved. 1 See Fig. 1 for a comparison of the drives between N210808 and the ignition shot N221204. While there was some delay in reporting the yield by counting the neutrons, it was clear to me right away, by looking at the Dante signal of the x rays coming through the LEH, that a yield of order 3 MJ was achieved. These data were available immediately. The ignited target reheats 5 the hohlraum to a temperature higher than the 300 eV brought on by the original laser that drove the implosion. A 50-year-long effort had come to fruition.

A later repeat attempt at 2.05 MJ, shot N230729 in July 2023, produced 3.9 MJ due to even better target quality. Also of note was the achievement of ignition using 1.9 MJ of light, by adjusting the shock timing to allow the target to achieve a higher ρR through more convergence. These two shots, on June 23 and October 7, 2023, yielded 1.9 and 2.4 MJ, respectively. Details and official yields will be published in the near future. In addition, there have been shots using even more NIF energy, 2.2 MJ. The first shot ever tried with that increased energy ignited and yielded 3.5 MJ. A second shot, with attempts to fix some asymmetries, yielded 5.2 MJ. These results and their official yields should be published shortly.

In short, ignition accomplished! A pictorial summary of all of the above description appears in Fig. 10 .

Applications of ignition are already being discussed. As mentioned above, the reheating of the hohlraum by the ignited capsule already exceeds the heating of the hohlraum by the original laser. This opens up greater regions of parameter space for HEDP studies.

There is much work yet to be done to deepen our understanding of our results, to date, to help us move into the future.

Targets at the same scale can, in principle, perform better if they can be driven at lower adiabat to higher convergence (and thus higher confinement parameter, ρR). Is convergence in present experiments limited by low mode asymmetry, by shock mistiming, by RTI and its concomitant fuel ablator mix, 163 or by some combination of all of these? Much work remains to illuminate this issue.

Better predictive capability is needed with respect to LPI, drive, and symmetry. The difficulty in quantifying LPI affects the coupling efficiency of the hohlraum, and thus drive and symmetry. LPI can produce sources of capsule preheat, and threatens laser damage if too much SBS scatters back into the lenses. CBET affects symmetry, and its sources of saturation must be better understood. Other very difficult-to-quantify issues, such as non-LTE physics can affect drive and symmetry. Steady progress is under way in improving our NLTE models.

This paper has not gone into detail on the efforts to improve our hohlraum modeling. However, it behooves me to mention the pioneering work of Jim Hammer and Steve Maclaren in devising a very useful platform, the “view factor hohlraum” that allows better experimental access to hohlraum dynamics. This platform has been highly useful in the endeavor to reach a deeper understanding of hohlraum plasmas, from its maiden voyage a decade ago 164 until this day. 165 Better predictive capability can shorten the iteration times for experimental campaigns and is also needed in increasing the credibility of the planning of future facility upgrades. Continued innovation in diagnostic techniques 120 and in code developments 142 will surely aid in all of this.

Pushing onwards toward higher gains 166 can be done along two paths: with the same energy and with higher energy upgrades to the laser.

With the same driver energy, we are pursuing a variety of hohlraums. The frustraum 167 has less surface to volume than cylinders and can thus be more efficient, and thus drive larger capsules. The I-raum, mentioned above, can be combined with Δλ and CBET to perhaps allow for bigger capsules. Mag-raums, which are hohlraums embedded in a B field, can lead to igniting targets at perhaps even less energy, because of the field's inhibition of conduction losses from the hotspot. 168,169

Higher convergence, via lower adiabat along with more hydro stability, is the path being pursued by the SQ-n approach mentioned earlier. Alternative ablators, such as B₄C, which are created by amorphous layering, may allow for a weaker first shock (there being no crystal structures to melt, as in HDC) and thus a lower adiabat. Perhaps a reevaluation of all the work that has been done on Be ablators (design-wise and experimental) would prove fruitful. Of course, we can also continue on the HYBRID-E path with thicker ablators and by improving low mode symmetry. 170 As mentioned above, another slight up-tick in available NIF energy (to 2.2 MJ) has just been provided. The first try, despite an imperfectly shaped implosion, still provided the second highest yield to date. The second try improved the shape somewhat, and a 5.2 MJ yield ensued.

There are serious plans to upgrade NIF to the 2.6–3 MJ range. Yields are expected to increase into the many tens of megajoules. 166 As just mentioned, a better predictive capability for LPI, drive, and symmetry will help with the credibility in planning for such an upgrade. Moreover, experiments are planned on the current size NIF to investigate what LPI to expect in the bigger hohlraums designed for the 2.6–3 MJ driver scale.

The lists above should not be considered exhaustive (exhausting, maybe, but not exhaustive). Much more innovation can and should happen in capsule and in hohlraum design. The world is not standing still and simply observing the progress at NIF. The LMJ effort at Bordeaux is pursuing rugby shaped hohlraums. 78 The SGIII laser in China is using hohlraums with an eightfold symmetry. 171 There are world-wide efforts in direct drive, 22 fast ignition, 58 and shock ignition, 172 not to mention several IFE startups each with their own scheme.

Innovations in target fabrication, diagnostic techniques, and broad-band laser technologies are also under way. All of this can help define a minimum size driver that can lead to yields well in excess of 100 MJ, necessary for the stewardship mission as well as the IFE applications. This is simply a wonderful time to be involved in ICF research.

As I hope I have made clear in this short history of the long path to ignition, ignition was achieved after many decades of advances in physics design, simulation codes, lasers, optics, targets, and diagnostics. It has been a bold journey into extreme physics, engineering, and technology that required the long-term persistence of an extremely talented workforce. It required that all participants take the “long view” of how all of this was to be accomplished. In particular, it needed the long-term support of DOE, Congress, and the National Labs. In addition, I think I have demonstrated that, every step of the way, the LLNL indirect drive effort was aided by national and international collaboration and teamwork.

So, what are some of the lessons learned along the way? I list a few as follows:

Seek out and heed wise external counsel . This is somewhat of a tautology, because there were probably a few instances of un-wise counsel as well. An outstanding example of wise counsel, to my mind, is the advice not to pursue a 10-MJ laser as a follow on to Nova, but to take the riskier, but at least affordable, path of a 2-MJ NIF. Had we not heeded that, we would probably not have achieved ignition, because we would still be waiting for the 10 MJ facility to be authorized. Other examples include our two outside expert summer study groups, who were correct to suggest both "push longer" and a move to low density gas filled hohlraums to minimize LPI.

Optimize on 3D stability, not 1D robustness . The world is 3D and so are ICF implosions. The high foot approach and its successors helped bring target performance into the realm of the understandable, because of its stabilizing features. While these stabilizing, higher adiabat, systems have a reduced “upside” with respect to high gain, they at least give us a basis upon which to build future efforts at lowering the adiabat. As also described herein, this same philosophy of more robustness, but at the cost of “lower gain,” led us to the first successful demonstration of x-ray lasing in the laboratory.

Diversify approaches : Having an independent development path for HDC (vs the mainline CH ablator approach) showed wisdom, especially in hindsight when we needed shorter pulses in low density gas fill hohlraums, and HDC was better matched to that. Too much diversity dilutes the efforts, so a good balance between mainline and alternative approaches must be maintained.

Be prepared to take advantage of surprises : The near vacuum exploding pusher experiment showed a vanishingly small level of LPI. To its credit, the program pivoted to this approach, and had to overcome its own (meta) inertial confinement to do so. While it is one thing to have the ability to be “light on one's feet,” it is entirely another to actually choose to do so.

New diagnostics and new code packages are needed for progress : Looking back over the nearly 50 years I have been engaged in this research, I have seen innumerable times when a new capability (and especially a new diagnostic) came on-line that opened our eyes to physics that was happening and that we could not have imagined without it.

When analyzing data, include all the details: I have seen this many times: the only way to match data with simulations is to include all the relevant details in the simulations. Two examples come to mind: Nathan Meezan's zoning up of the solid holder at the waist of the gas-bag experiments on Omega; and Steve Maclaren's zoning up of the storm-window LEH, followed by Drew Higginson's explanation of symmetry in near-vacuum hohlraums that involved CBET in that extra LEH plasma.

Much detailed attention and support from upper Lab management is needed: I have seen this going back nearly 50 years, from Roger Batzel and Mike May, extending to the special efforts (political and institutional support-wise) of Bruce Tarter during NIF construction, and through to our latest two directors, Bill Goldstein and Kim Budil.

Utilize a worldwide, world-class, diverse workforce: The world is a big place, and scientists with sharp minds and superb skills can come from anywhere. Getting a declassification of indirect-drive ICF approved opened the door for our present extremely talented and dedicated workforce from all over the world.

Above all: Exploit ICF's “superpower”: flexibility and the ability to innovate. As described in this discourse, the original NIF point design had to be changed in every single aspect for ignition to be achieved. This inherent flexibility of ICF allowed us to respond to whatever hard truths Mother Nature threw in our way. I believe that we must continue to utilize this superpower in order to make the progress that will lead to much higher yields in the future.

I must express my appreciation to the current ICF Program leadership. They, and the technical teams that they lead, are the true heroes of this story, as they are the ones that ultimately made ignition a reality: R. Town, N. Landen, J. Moody, B. Spears, D. Hinkel, W. Farmer, A. Kritcher, C. Weber, M. Marinak, A. Pak, T. Chapman, K. Humbird, S. Ross, O. Hurricane, K. Raman, V. Smalyuk, G. Brunton, B. Van Wonterghem, A. Nikroo, A. MacKinnon, B. Woodworth, and M. Stadermann.

This long path counted on the wise guidance and support of LLNL's lab directors throughout the years, and we are grateful for it: J. Foster, M. May, R. Batzel, J. Nuckolls, B. Tarter, M. Anastasio, P. Albright, B. Knapp, G. Miller, B. Goldstein, and K. Budil. Similarly, we thank the LLNL associate directors responsible for the overall program, including R. Woodruff, R. Fortner, B. Goodwin, C. Verdon, K. Budil, B. Wallin, M. Herrmann, and T. Arsenlis. On a more “local” level, we appreciate the dedicated efforts of the actual ICF Program leaders over the years: J. Emmett, H. Ahlstrom, L. Coleman, E. Storm, J. Davis, M. Campbell, J. Lindl, J. Kilkenny, B. Hammel, E. Moses, J. Wisoff, B. MacGowan, J. Atherton, J. Edwards, M. Herrmann, and R. Town, as well as our partners at DoE's NNSA throughout the years.

It is, sadly, the nature of things that if 50 years pass, so will some of the pioneers who toiled and excelled in this field. So, in memoriam, we mention R. Thiessen, Y. Pan, B. Still, O. Jones, A. Szoke, R. Kidder, S. Colgate, J. Murray, H. Powell, J. Trenholme, L. Foreman, C. Hendricks, N. Ceglio, J. Koch, J. Grun, H. Baldis, H. Rose, J. Albritton, D. Liberman, S. Maxon, R. Ratowsky, A. Simon, K. Estabrook, J. Denavit, T. Shepard, M. Feit, and E. Burke.

My many colleagues who have been instrumental in so much of the work described here are too numerous to mention, but I sincerely thank them for their expertise and their friendship. As this paper is a compendium of history, I could not have written it without expert contributions of memories and insights from my colleagues, and I truly appreciate their input; thus, I wish to acknowledge J. Lindl, D. Larson, M. Campbell, D. Hinkel, O. Hurricane, J. Edwards, N. Meezan, J. Kilkenny, D. Clark, J. Moody, J. Kline, and N. Landen. The suggestions and clarifications of all three anonymous referees are also hereby acknowledged with gratitude.

As mentioned earlier, LLNL is not alone in this achievement. I thank our colleagues at LANL, UR/LLE, SNL, NRL, GA, AWE, CEA, and MIT for their valuable contributions to this effort. The current LLNL ICF staff in target physics, code development, the NIF facility, target fabrication, and diagnostic development all played a role in this historic achievement, and I salute them.

Finally, a debt that we cannot repay goes to our families and loved ones, who support us all, every day.

This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract No. DE-AC52-07NA27344. This document was prepared as an account of work sponsored by an agency of the United States government. Neither the United States government nor Lawrence Livermore National Security, LLC, nor any of their employees makes any warranty, expressed or implied, or assumes any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately owned rights. Reference herein to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise does not necessarily constitute or imply its endorsement, recommendation, or favoring by the United States government or Lawrence Livermore National Security, LLC. The views and opinions of the author expressed herein do not necessarily state or reflect those of the United States government or Lawrence Livermore National Security, LLC, and shall not be used for advertising or product endorsement purposes.

The author has no conflicts to disclose.

Mordecai D. Rosen: Conceptualization (lead); Writing – original draft (lead).

Data sharing is not applicable to this article as no new data were created or analyzed in this study.


  • Online ISSN 1089-7674
  • Print ISSN 1070-664X

Fill in the Blanks

A free HTML5-based question type allowing authors to create fill-in-the-blanks tasks, also known as cloze tests, with H5P in publishing systems like Canvas, Brightspace, Blackboard, Moodle, and WordPress.

Would you like to create content like this on your own?

Register on H5P.com to start creating H5P interactive content. Your content can be accessed via direct link, embedded, or inserted into any learning management system that supports LTI integration.


Description

Learners fill in the missing words in a text. The learner is shown a solution after filling in all the missing words, or after each word depending on settings.

Authors enter text and mark the words to be replaced with an asterisk. In addition to native and second language learning, Fill in the blanks can be used to test the learner's ability to reproduce facts or produce mathematical inferences.
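
The asterisk convention above can be sketched in code. This is an illustrative Python sketch only, not H5P's actual implementation (the real H5P.Blanks content type is written in JavaScript and supports further options, such as a case-sensitivity setting); the sample sentence and function names are invented for the example:

```python
import re

def parse_blanks(text):
    """Split authored text into literal segments and blanks.

    Blanks are delimited by asterisks, e.g. "*France*"; alternative
    accepted answers inside a blank are separated by "/".
    (Sketch only -- not H5P's real parser.)
    """
    parts = re.split(r"\*([^*]+)\*", text)
    literals = parts[0::2]                        # text around the blanks
    blanks = [p.split("/") for p in parts[1::2]]  # accepted answers per blank
    return literals, blanks

def check_answer(accepted, given, case_sensitive=True):
    """Return True if `given` matches one of the accepted answers."""
    if not case_sensitive:
        return given.lower() in (a.lower() for a in accepted)
    return given in accepted

literals, blanks = parse_blanks("H5P content is stored in the *Cloud/cloud*.")
print(blanks)                            # [['Cloud', 'cloud']]
print(check_answer(blanks[0], "cloud"))  # True
```

Listing alternatives (or turning off case sensitivity, as discussed in the comments below) is how authors avoid marking "cloud" wrong when the model answer is "Cloud".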

Learn how to create Fill in the blanks in this tutorial.

The H5P content on this page is licensed under Creative Commons Attribution 4.0 International unless another Creative Commons license is specified under rights of use. The author of the content is H5P Group.

New to H5P? Read the installation guide to get H5P on your own site.

vev1

Tue, 02/24/2015 - 15:47

It's case sensitive

I typed "cloud" instead of "Cloud" and it was wrong.


Tue, 02/24/2015 - 15:48

Ah I see in the documentation

Ah I see in the documentation now, so you can set the case-sensitivity. Excellent!

Oliver Cupén

Sat, 06/05/2021 - 05:41

The "*" is a unit of force

bayofislands

Thu, 06/04/2015 - 21:43

A single apostrophe fails to match

If a single apostrophe is in the answer, it fails to match. For instance, say the blank to be filled in is "three weeks' time".

It appears that a &nbsp; is used where a plain space should be, between "weeks'" and "time".

falcon

Fri, 06/05/2015 - 08:11

This is one of several issues

Fri, 06/05/2015 - 12:58

OK, sounds like a hassle; but our use case is grammar & style tests

 . . . so lots of combinations of apostrophes and other punctuation unfortunately.

Mon, 06/08/2015 - 11:28

You may fix this by removing

You may fix this by removing the wysiwyg editor for that field. Change the widget for that field from html to textarea IIRC and you should be fine.

Related docs: http://h5p.org/adding-text-editor-buttons

Drupal hooks: http://h5p.org/node/2470

Are you using Wordpress?

Mon, 06/08/2015 - 12:02

Ok I'll look into that, experimenting with the Quiz module now

I'm using Drupal 7 latest on Ubuntu 14.04 - currently experimenting with the basic Quiz module again, but I'll confirm your fix asap. Thanks. 

Thu, 06/18/2015 - 11:05

found an alternative

use the key under the Escape key to add apostrophe`s ;)

it made my day

Fri, 10/16/2015 - 09:17

Way to accept any answer?

Is there a way to accept any answer? All I really want the learner to do is to fill in, in each sentence, an emotion that gets repeated twice in the same sentence. In each sentence a new emotion needs to be written. Is there a way I can alter the code to accept any answer?

Tue, 10/20/2015 - 10:43

I guess you could try to

Thu, 01/28/2016 - 16:44

Adding Sound H5P

How can I add sound to the Fill In The Blanks questions? For example pressing a word the word is produced. I am doing this in Opigno Drupal Distribution.

Fri, 01/29/2016 - 18:39

This is not possible at the

nguyentuanmy2206

Fri, 04/15/2016 - 11:38

Resize the input field

I am creating a kind of "rewrite the sentences" exercise, and Fill in the Blanks seems to be the best solution in H5P. The problem is that the input fields are much smaller than the sentences. Is there any way to make the input field look like a line (example: ___________), or to resize the input fields? Or is there a better solution for creating "rewrite the sentences" exercises? Sorry about my English.

Thank you so much!

Fri, 04/15/2016 - 15:14

Hi! You could try "Put input

Tue, 04/19/2016 - 10:43

Thank you so much, works like

Thank you so much, works like a charm. :)

21cccs

Tue, 08/02/2016 - 01:21

A greater than/less than sign won't show as correct

I've set up a math fill-in-the-blank greater-than or less-than problem -- I have an image that shows a number line, and the students need to put in the following answers.

1.  *x < 2*

2.  *x > -2*

Unfortunately, when you enter this exactly, they show up as incorrect. Can ">" or "<" be used in this way?

thomasmars

Tue, 08/02/2016 - 09:39

Thanks for reporting this.

There is an existing issue for fixing this; hopefully it will get done in time for the next release :)

dschahel

Wed, 11/09/2016 - 10:10

inequalities

Read the following and fill in the blanks with appropriate letters and symbols

JoeGoldfarbH5P

Thu, 05/11/2017 - 23:16

Drag and drop

Hi there. Is there a drag and drop function that could be added to this app, so a user could see the possible solutions and drop them in instead of typing? If not, does anyone know where I could create that?

tomaj

Fri, 05/12/2017 - 06:39

Drag the Words

Have you checked out Drag the Words ?

Wed, 09/20/2017 - 13:06

cant use H5P any more

I don't know if it is a local problem or a bug, but I'm no longer able to use H5P on my WP site. Actually, nothing gets shown when I insert the shortcode in content, and in the H5P plugin the created modules do not work either. I tried deactivating all other plugins, in vain. What is this? A bug? It does not only affect Fill in the Blanks, but all content types.

My version is 1.9.2.

A former WordPress site of mine uses H5P version 1.8.4, and it works fine. Is it really local, or is this malfunction connected with the latest version?

At the moment, H5P is unusable. I can create content, but I cannot make this content appear on a WP page or post.

BV52

Thu, 09/21/2017 - 07:26

Hi Aron, I'm sorry to hear

I'm sorry to hear that you are unable to use H5P at the moment. Would you mind sharing the version of WordPress? Could you please look for JavaScript error messages in your console (CTRL + SHIFT + J in Chrome)?

Thu, 09/21/2017 - 12:05

Hi there, Thank you for your

Thank you for your help. Actually, it didn't help; I had to deactivate and remove H5P and then install and activate it again. With this action, to my disappointment, all the previously created H5P content disappeared. I thought, just like with other plugins, that with a removal and reinstall the content would be preserved. Unfortunately, that wasn't the case. The good thing is that I have a backup; maybe I will be able to export all this content from a restored backup and import it back into the new H5P.

The other good thing is that H5P now works as intended: when I create content, it appears and can be tested or inserted into pages or posts.

Btw, I use wordpress 4.8.2 and there were no error messages at all.

Thu, 09/21/2017 - 16:43

Hi Aron, We have taken note of

We have taken note of this and will do what we can so that this does not happen again if an uninstallation of H5P is needed.

Sun, 10/01/2017 - 18:12

Still not OK

The problem is that with WordPress 4.8.2 and H5P upgrading to 1.9.4, all my previously created content got deleted. Come on, really! Do I have to save each and every piece of content one by one before upgrading, and load them back with a tedious one-by-one process? H5P is such a great plugin, but this lack of a feature makes it a pain in the neck.

Thu, 10/19/2017 - 14:21

How can I add images to this activity?

Fri, 10/20/2017 - 04:57

Hi hoje25, At the top of the

At the top of the editor you should see an option called "Media". That's where you can upload a video or image.

Sat, 11/04/2017 - 10:56

Elements of narration

<p>Fill in the blanks with the elements of narration</p>

Mon, 11/06/2017 - 04:46

Hi Janeleft, I've Google

Hi Janeleft,

I've Google-translated your comment :-)

Are you suggesting having an audio recorder, or an option to upload audio, in a Fill in the blanks activity? If so, I suggest you head over to the Feature Request forum to post your suggestion. Please include use cases and examples of how you would like it to be implemented. I would also suggest posting it in English to reach a wider audience.

peterjonker

Wed, 12/20/2017 - 14:43

Slash part of the answer

I have a question where the user needs to input a numeric code. Slashes are part of the code, e.g. 8480/3/9. Since the slash is used for dividing possible answers, I suppose it is not possible to have it available as an input character. Peter

Thu, 12/21/2017 - 06:04

Hi Peter, You are correct this

You are correct, this is not possible. I would suggest that if the slashes are constant and won't give away the answer, you can do it like *8480*/*3*/*9*. I would also suggest that you head over to the Feature Request forum to post this as a suggestion.
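
Because "/" separates alternative answers inside a blank, a literal slash cannot appear within a single blank. A hypothetical Python sketch (not H5P's actual code) of why *8480*/*3*/*9* behaves differently from *8480/3/9*:

```python
import re

# Illustrative sketch: "/" inside a blank separates ALTERNATIVE answers,
# so "*8480/3/9*" is one blank that accepts "8480", "3", or "9" --
# not the literal string "8480/3/9".
def blanks_of(text):
    return [blank.split("/") for blank in re.findall(r"\*([^*]+)\*", text)]

print(blanks_of("*8480/3/9*"))      # [['8480', '3', '9']]
# The workaround keeps the slashes as literal text between three blanks:
print(blanks_of("*8480*/*3*/*9*"))  # [['8480'], ['3'], ['9']]
```

With the workaround, the learner types the three numeric parts and the slashes are displayed as fixed text between the input fields.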

Thu, 12/21/2017 - 08:54

Thanks for your prompt reply. With your code it works fine. I did not know about the FR forum yet, so I will post my FRs there in the future.

Thu, 12/21/2017 - 08:56

It's good to hear that it

It's good to hear that it works for you, Peter. If you have any other questions feel free to post in the forums :-)

Sat, 01/27/2018 - 13:58

Upload of an example to Moodle

I'm trying to upload a file (fill-in-the-blanks-837.h5p) to Moodle, but some problems arise:

Validating h5p package failed.

Missing required library H5P.Blanks 1.8
Missing required library H5P.JoubelUI 1.3
Missing required library H5P.TextUtilities 1.0
Missing required library H5P.Question 1.3

What should I do? Regards, Marge

Mon, 01/29/2018 - 07:52

Hi marge, Thank you for

Thank you for reporting this. I am unable to reproduce the issue that you mentioned above. In order to give your bug report the best chance of getting answered, please include the following information:

  • Does this happen to all content you try to upload in Moodle?
  • Detailed steps to reproduce the bug (exactly how and when did it happen)
  • Platform and version number. E.g. Drupal, Wordpress, Moodle.
  • Mobile or Desktop
  • Browser: Chrome, Firefox, Safari etc
  • H5P plugin version
  • H5P content type and version (if a content type was used)
  • Any browser console errors
  • Any PHP errors
  • Screenshots if it's a visual problem

The more information you provide, the quicker the community will be able to fix it and the quicker you'll have a working solution!
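
As background to "Missing required library" errors like the ones above: an .h5p file is a zip archive whose h5p.json manifest lists the libraries the content depends on under preloadedDependencies, and the platform complains when one of them is not installed. A small Python sketch of inspecting that manifest (the in-memory demo package is invented for illustration):

```python
import io
import json
import zipfile

def list_dependencies(h5p_file):
    """Return 'MachineName major.minor' strings from a package's h5p.json."""
    with zipfile.ZipFile(h5p_file) as archive:
        manifest = json.loads(archive.read("h5p.json"))
    return [
        f"{dep['machineName']} {dep['majorVersion']}.{dep['minorVersion']}"
        for dep in manifest.get("preloadedDependencies", [])
    ]

# Build a tiny in-memory stand-in for a real .h5p package, just for the demo.
demo = io.BytesIO()
with zipfile.ZipFile(demo, "w") as z:
    z.writestr("h5p.json", json.dumps({
        "title": "Demo",
        "preloadedDependencies": [
            {"machineName": "H5P.Blanks", "majorVersion": 1, "minorVersion": 8},
            {"machineName": "H5P.Question", "majorVersion": 1, "minorVersion": 3},
        ],
    }))
demo.seek(0)
print(list_dependencies(demo))  # ['H5P.Blanks 1.8', 'H5P.Question 1.3']
```

Comparing this list against the libraries installed on the server usually identifies which dependency the upload is missing.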

Mon, 02/26/2018 - 16:32

Hello! Is there a wildcard option for Fill in the Blanks? I know I can use alternate answers to do roughly the same thing, but it would be ideal to use wildcards so I do not have to cover all possible combinations of typos/spelling errors. Thank you for a wonderful set of tools! tim

Tue, 02/27/2018 - 05:40

Hi Tim, Thanks a lot for

Thanks a lot for contributing your ideas on how to make H5P better! We're now working on something called the H5P supporter program, allowing the H5P community to vote for and fund the top-voted H5P features. Also, there are developers in the community who every now and then work on a feature they find interesting or useful. In order for your feature request to attract as much interest as possible, make sure it follows the guidelines below:

It is clear from every perspective how the feature will work. We recommend describing the feature with one or more user stories, for instance “As an author I want it to be possible to pick between different effects for the check answer animation so that the learners will see a variety of effects and also I can adapt the effects to my target audience(I’ll be using pink unicorns which works really well for both my target audience which are 4 year old girls and venture capitalists)”

If the feature can be illustrated with images or videos it always helps

Make it clear what content types this is relevant for, and or if this is a new content type

Make sure you post the feature in the Feature Request forum.

Fri, 03/02/2018 - 16:44

Thank you for the update and information.

I will put together a post for the Feature Request Forum.

Fri, 05/11/2018 - 19:31

fatal error when there are too many blanks

I wonder if you can reproduce the following problem and if other people have already experienced this issue.

So there is a fill-in-the-blanks test, and in one exercise there are more than 25 blanks, with immediate feedback set up. Everything goes fine until the very last item; when I try to finish the whole exercise, the page informs me that something happened: either an error message pops up, or there is a blank white page and the whole page reloads.

When there are 10 or 15 blanks in another exercise this error does not occur.

Any ideas? Am I supposed to keep the number of blanks around 10?

Mon, 05/14/2018 - 03:55

Hi mindenaron, I posted a

Hi mindenaron,

I posted a separate thread for your concern. 

eleguedez

Sun, 05/13/2018 - 22:54

Select word from list

Would it be possible to have a function that allows people to choose the answers from a drop-down list in every item, instead of typing them in?

If this is not possible, could you tell me how I can code this? It must be really easy code, but I'm not a programmer, I'm an English teacher.

Mon, 05/14/2018 - 05:22

Hi eleguedez, I'm afraid this

Hi eleguedez,

I'm afraid this is not possible. You can substitute Single Choice Set  for this by creating a statement with a "blank" and the students can choose from the answers provided. There is also an existing feature request for this. 

Tue, 05/15/2018 - 15:04

RE: Select word from list

Ah ok, I see. Yeah I had thought about that, but of course, it's not exactly the same as it's not possible to insert several single choice sets within one single long paragraph. An SCS is a totally different item type on its own.

An alternative I came up with was actually typing within parentheses the options that I want to prompt, and then the student has to fill in the blanks accordingly. Could be more time-consuming to build, but it does the trick.

Thanks for the reply! It was still very helpful!

aguilardustin

Mon, 12/17/2018 - 18:52

Download fill-in-the-blanks

I'm trying to find the file to download. I have the module on Drupal already, but I don't have the fill-in-the-blanks plugin, so I'm trying to find the file to upload.

Tue, 12/18/2018 - 02:10

Hi aguilardustin, You can

Hi aguilardustin,

You can download the sample content above. The download button is at the bottom of the content.

Thu, 06/27/2019 - 15:32

Do you have these resources for PPT and Word, please?

Unfortunately I do not have Moodle, WordPress, or Drupal. Have you got resources for Microsoft Office, please?

Fri, 06/28/2019 - 05:25

Hi AMidgley, I'm afraid there

Hi AMidgley,

I'm afraid there is no plugin/feature that lets you use Office with H5P.

elihardiehowes

Mon, 01/06/2020 - 22:29

First letter(s) as a clue.

Is it possible to have the first letter as a clue to the blank the user must fill out? Here is an example:  https://www.ielts-exam.net/vocabulary/IELTS_Vocabulary_Missing_Words/1040/?fbclid=IwAR1-8xs5mnAVIQB8YFcMTv0Ah5BUVNROsNI1drulJX3drL3jDBngdIk4mVI

Tue, 01/07/2020 - 00:10

Hi elihardiehowes, I'm afraid

Hi elihardiehowes,

I'm afraid this is not possible. Afaik, Fill in the Blanks determines the word as the answer if it has a space prior to the asterisk. Having said this, using the first letter as a clue is not possible. However, you can post a feature request. In order for your feature request to attract as much interest as possible, make sure it follows the guidelines below:

It is clear from every perspective how the feature will work. We recommend describing the feature with one or more user stories, for instance “As an author, I want it to be possible to pick between different effects for the check answer animation so that the learner will see a variety of effects and also I can adapt the effects to my target audience(I’ll be using pink unicorns which works really well for both my target audience which are 4 year old girls and venture capitalists)”

