While the development and use of the atomic bomb effectively ended World War II, it also propelled the world into an even more destructive arena of nuclear weaponry: the "superbomb," or hydrogen bomb. The notion of using a fission weapon to ignite a process of nuclear fusion dates back to 1942, when the first major theoretical conference on the development of an atomic bomb took place. Hosted by J. Robert Oppenheimer at the University of California, Berkeley, the conference was notable in particular for the contributions of Edward Teller, who directed much of the discussion toward Enrico Fermi's idea of a "Super" bomb that would utilize the same reactions that powered the sun itself.
Although physicists had discovered in the early 1920s that the fusion of hydrogen into helium is the energy source of the sun and other stars, it was not until nuclear fission was discovered in 1938, and its discovery published in 1939, that the world knew atomic bombs were possible. In May 1941 Tokutaro Hagiwara, a Japanese scientist at the University of Kyoto, suggested that the explosive fission chain reaction of uranium-235 could trigger a thermonuclear reaction between hydrogen atoms. In September of that year, Fermi proposed a similar idea to Teller, and the two U.S.-based scientists went on to propose using an atomic explosion to initiate thermonuclear reactions in deuterium. For the next decade, Teller was obsessed with the idea of creating a thermonuclear superbomb.
In 1942 it was believed that a fission weapon would be relatively simple to develop and that work on a hydrogen bomb might even be completed before the end of World War II. But the problems of creating and delivering a "basic" atomic bomb preoccupied scientists for the next few years, leaving little room for the more speculative "Super." Teller was thus the only scientist who continued to work on the project, over the objections of project leaders Oppenheimer and Hans Bethe.