Category Archives: Quantum Physics
Posted: June 24, 2017 at 2:59 pm
June 23, 2017. Single spins in silicon carbide absorb and emit single photons based on the state of their spin. Credit: Prof. David Awschalom
An international team led by the University of Chicago’s Institute for Molecular Engineering has discovered how to manipulate a weird quantum interface between light and matter in silicon carbide along wavelengths used in telecommunications.
The work advances the possibility of applying quantum mechanical principles to existing optical fiber networks for secure communications and geographically distributed quantum computation. Prof. David Awschalom and his 13 co-authors announced their discovery in the June 23 issue of Physical Review X.
“Silicon carbide is currently used to build a wide variety of classical electronic devices today,” said Awschalom, the Liew Family Professor in Molecular Engineering at UChicago and a senior scientist at Argonne National Laboratory. “All of the processing protocols are in place to fabricate small quantum devices out of this material. These results offer a pathway for bringing quantum physics into the technological world.”
The findings are based in part on theoretical models of the materials developed by Awschalom’s co-authors at the Hungarian Academy of Sciences in Budapest. A research group at Sweden’s Linköping University grew much of the silicon carbide material that Awschalom’s team tested in experiments at UChicago, and a team at the National Institutes for Quantum and Radiological Science and Technology in Japan helped the UChicago researchers create quantum defects in the materials by irradiating them with electron beams.
Quantum mechanics governs the behavior of matter at the atomic and subatomic levels in exotic and counterintuitive ways compared to the everyday world of classical physics. The new discovery hinges on a quantum interface within atomic-scale defects in silicon carbide that generates the fragile property of entanglement, one of the strangest phenomena predicted by quantum mechanics.
Entanglement means that two particles can be so inextricably connected that the state of one particle can instantly influence the state of the other, no matter how far apart they are.
“This non-intuitive nature of quantum mechanics might be exploited to ensure that communications between two parties are not intercepted or altered,” Awschalom said.
Exploiting quantum mechanics
The findings enhance the once-unexpected opportunity to create and control quantum states in materials that already have technological applications, Awschalom noted. Pursuing the scientific and technological potential of such advances will become the focus of the newly announced Chicago Quantum Exchange, which Awschalom will direct.
An especially intriguing aspect of the new paper was that silicon carbide semiconductor defects have a natural affinity for moving information between light and spin (a magnetic property of electrons). “A key unknown has always been whether we could find a way to convert their quantum states to light,” said David Christle, a postdoctoral scholar at the University of Chicago and lead author of the work. “We knew a light-matter interface should exist, but we might have been unlucky and found it to be intrinsically unsuitable for generating entanglement. We were very fortuitous in that the optical transitions and the process that converts the spin to light is of very high quality.”
The defect is a missing atom that causes nearby atoms in the material to rearrange their electrons. The missing atom, or the defect itself, creates an electronic state that researchers control with a tunable infrared laser.
“What quality basically means is: How many photons can you get before you’ve destroyed the quantum state of the spin?” said Abram Falk, a researcher at the IBM Thomas J. Watson Research Center in Yorktown Heights, N.Y., who is familiar with the work but not a co-author on the paper.
The UChicago researchers found that they could potentially generate up to 10,000 photons, or packets of light, before they destroyed the spin state. “That would be a world record in terms of what you could do with one of these types of defect states,” Falk added.
Awschalom’s team was able to turn the quantum state of information from single electron spins in commercial wafers of silicon carbide into light and read it out with an efficiency of approximately 95 percent.
The duration of the spin state, called coherence, that Awschalom’s team achieved was a millisecond. Not much by clock standards, but quite a lot in the realm of quantum states, in which multiple calculations can be carried out in a nanosecond, or a billionth of a second.
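To put those timescales side by side, a quick back-of-the-envelope calculation. The one-nanosecond operation time is the article's illustrative figure, not a measured gate time for this device:

```python
# Back-of-the-envelope: how many quantum operations fit inside the measured
# coherence time? The 1 ns operation time is illustrative, not a device spec.
coherence_time_s = 1e-3   # measured spin coherence: one millisecond
operation_time_s = 1e-9   # assumed duration of a single quantum operation

ops_within_coherence = coherence_time_s / operation_time_s
print(f"roughly {ops_within_coherence:.0e} operations per coherence window")
```

A millisecond of coherence therefore leaves room for on the order of a million sequential operations.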
The feat opens up new possibilities in silicon carbide because its nanoscale defects are a leading platform for new technologies that seek to use quantum mechanical properties for quantum information processing, sensing magnetic and electric fields and temperature with nanoscale resolution, and secure communications using light.
“There’s about a billion-dollar industry of power electronics built on silicon carbide,” Falk said. “Following this work, there’s an opportunity to build a platform for quantum communication that leverages these very advanced classical devices in the semiconductor industry,” he said.
Most researchers studying defects for quantum applications have focused on an atomic defect in diamond, which has become a popular visible-light testbed for these technologies.
“Diamond has been this huge industry of quantum control work,” Falk noted. Dozens of research groups across the country have spent more than a decade perfecting the material to achieve standards that Awschalom’s group has mastered in silicon carbide after only a few years of investigation.
Silicon carbide versatility
“There are many different forms of silicon carbide, and some of them are commonly used today in electronics and optoelectronics,” Awschalom said. “Quantum states are present in all forms of silicon carbide that we’ve explored. This bodes well for introducing quantum mechanical effects into both electronic and optical technologies.”
Researchers now are beginning to wonder if this type of physics also may work in other materials, Falk noted.
“Moreover, can we rationally design a defect that has the properties we want, not just stumble into one?” he asked.
Defects are the key.
“For decades the electronics industry has come up with a myriad of tricks to remove all the defects from their devices because defects often cause problems in conventional electronics,” Awschalom explained. “Ironically, we’re putting the defects back in for quantum systems.”
Explore further: Exceptionally robust quantum states found in industrially important semiconductor
More information: “Isolated Spin Qubits in SiC with a High-Fidelity Infrared Spin-to-Photon Interface,” Physical Review X (2017). journals.aps.org/prx/abstract/10.1103/PhysRevX.7.021046
Posted: June 23, 2017 at 6:47 am
By Ivan Potocki, Contributor. Published: June 22, 2017 07:01 EST
Many of the answers to life’s great questions have been laid at the door of the mega-brained scientists who specialise in quantum physics. Is there evidence of a god? How did the universe begin?
But what about using the theories to revolutionize how we play casino games?
A team of scientists from China and Bristol has come up with the idea of a gambling protocol that doesn’t depend on the integrity of the participants. Instead, this new protocol is founded on the idea of rationality: the notion that both parties will make decisions they perceive to give them the best winning chances.
This new protocol is based on the mix of game theory and quantum mechanics, and scientists believe it could find its application in casinos and lotteries sometime in the future.
It is nearly impossible for two players to gamble, putting something of value on the line, without having a third party supervising the game because of the temptation to bend the rules or cheat. This third party is necessary to make sure everything is fair, and everyone keeps their end of the bargain. However, it seems that quantum mechanics has a solution that would remove the need for the third party altogether.
The idea of quantum gambling revolves around the concept of a theoretical machine constructed between two participating players. The machine works based on two important principles: quantum superposition and Heisenberg’s uncertainty principle.
The uncertainty principle is a bit hard to understand for people not familiar with quantum mechanics, but it basically states that observing a particle will change its behavior. Quantum superposition means that a particle can be in two different states at once.
If this sounds confusing, that’s because it is.
But the gist of it all is that it would create a situation where one player knows the state of two particles on his or her side but doesn’t know if the states will change by the time they reach the other player. The other player has the option to try to guess the state of the particle he’s been sent, or to ask for a different one.
In theory, this would create an environment where both players need to adhere to the best strategy, creating a Nash equilibrium.
In this situation, they are playing a zero-sum game, and there is no need for third parties to supervise it. Although this idea only exists on paper at this time, scientists believe it can be used to develop a range of new gambling protocols based on quantum mechanics.
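To make the decision structure concrete, here is a purely classical toy simulation of the guess-or-reject choice. It is not the quantum protocol itself, which relies on superposition and the uncertainty principle for its guarantees; the payoff values and the 85 percent guessing accuracy are made-up illustrative numbers:

```python
import random

# Toy classical stand-in for the gambling protocol described above. We model
# only the decision structure: one player (call him Bob) may guess the hidden
# state of the particle he receives, or reject it and request another at a
# small cost. All probabilities and payoffs are illustrative assumptions.

def play_round(p_correct_guess=0.85, reject_penalty=0.1):
    """One round: Bob guesses (risking a wrong answer) or rejects (small cost)."""
    if random.random() < 0.5:          # Bob's mixed strategy: guess half the time
        return 1.0 if random.random() < p_correct_guess else -1.0
    return -reject_penalty             # rejecting costs a small fixed amount

random.seed(0)
rounds = 10_000
bob_total = sum(play_round() for _ in range(rounds))
print(f"Bob's average payoff per round: {bob_total / rounds:+.3f}")
```

In the real protocol the guessing probability is fixed by quantum mechanics rather than chosen by hand, which is what removes the need for a trusted third party.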
Posted: at 6:47 am
June 22, 2017 Artist’s rendering of a quantum thermometer. Credit: Emily Edwards/JQI
In an arranged marriage of optics and mechanics, physicists have created microscopic structural beams that have a variety of powerful uses when light strikes them. Able to operate in ordinary, room-temperature environments, yet exploiting some of the deepest principles of quantum physics, these optomechanical systems can act as inherently accurate thermometers, or conversely, as a type of optical shield that diverts heat. The research was performed by a team led by the Joint Quantum Institute (JQI), a research collaboration of the National Institute of Standards and Technology (NIST) and the University of Maryland.
Described in a pair of new papers in Science and Physical Review Letters, the potential applications include chip-based temperature sensors for electronics and biology that would never need to be adjusted since they rely on fundamental constants of nature; tiny refrigerators that can cool state-of-the-art microscope components for higher-quality images; and improved “metamaterials” that could allow researchers to manipulate light and sound in new ways.
Made of silicon nitride, a widely used material in the electronics and photonics industries, the beams are about 20 microns (20 millionths of a meter) in length. They are transparent, with a row of holes drilled through them to enhance their optical and mechanical properties.
“You can send light down this beam because it’s a transparent material. You can also send sound waves down the beam,” explained Tom Purdy, a NIST physicist who is an author on both papers. The researchers believe the beams could lead to better thermometers, which are now ubiquitous in our devices, including cell phones.
“Essentially we’re carrying a bunch of thermometers around with us all the time,” said JQI Fellow Jake Taylor, senior author of the new papers. “Some provide temperature readings, and others let you know if your chip is too hot or your battery is too cold. Thermometers also play a crucial role in transportation systems (airplanes, cars) and tell you if your engine oil is overheating.”
But the problem is that these thermometers are not accurate off the shelf. They need to be calibrated, or adjusted, to some standard. The design of the silicon nitride beam avoids this situation by relying on fundamental physics. To use the beam as a thermometer, researchers must be able to measure the tiniest possible vibrations in the beam. The amount that the beam vibrates is proportional to the temperature of its surroundings.
The vibrations can come from two kinds of sources. The first are ordinary “thermal” sources such as gas molecules buffeting the beam or sound waves passing through it. The second source of vibration comes purely from the world of quantum mechanics, the theory that governs behavior of matter at the atomic scale. The quantum behavior occurs when the researchers send particles of light, or photons, down the beam. Struck by light, the mechanical beam reflects the photons, and recoils in the process, creating small vibrations in the beam. Sometimes these quantum-based effects are described using the Heisenberg uncertainty relationship: the photon bounce leads to information about the beam’s position, but because it imparts vibrations to the beam, it adds uncertainty to the beam’s velocity.
“The quantum mechanical fluctuations give us a reference point because essentially, you can’t make the system move less than that,” Taylor said. By plugging in values of Boltzmann’s constant and Planck’s constant, the researchers can calculate the temperature. And given that reference point, when the researchers measure more motion in the beam, such as from thermal sources, they can accurately extrapolate the temperature of the environment.
However, the quantum fluctuations are a million times fainter than the thermal vibrations; detecting them is like hearing a pin drop in the middle of a shower.
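The calibration-free idea can be sketched numerically: the thermal motion of a mechanical mode follows Bose-Einstein statistics, so a measured phonon occupation maps back to temperature using only Planck's and Boltzmann's constants. The 1 GHz mode frequency below is an assumed, illustrative value, not the actual device parameter:

```python
import math

# Sketch of primary thermometry: thermal motion of a mechanical mode follows
# Bose-Einstein statistics, so a measured phonon occupation n converts to
# temperature through fundamental constants alone, with no calibration step.
# The 1 GHz mechanical frequency is an assumed, illustrative value.

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
KB = 1.380649e-23       # Boltzmann constant, J/K

def occupation(temp_k, freq_hz):
    """Mean thermal phonon number of a mode at freq_hz and temperature temp_k."""
    return 1.0 / math.expm1(HBAR * 2 * math.pi * freq_hz / (KB * temp_k))

def temperature(n, freq_hz):
    """Invert Bose-Einstein statistics: recover temperature from occupation n."""
    return HBAR * 2 * math.pi * freq_hz / (KB * math.log1p(1.0 / n))

freq = 1e9                        # assumed 1 GHz mechanical mode
n_room = occupation(300.0, freq)  # thousands of phonons at room temperature
print(f"n at 300 K: {n_room:.0f}")
print(f"recovered T: {temperature(n_room, freq):.1f} K")
```

The quantum (zero-point) contribution provides the absolute reference: once it is resolved, any excess motion yields the environmental temperature directly.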
In their experiments, the researchers used a state-of-the-art silicon nitride beam built by Karen Grutter and Kartik Srinivasan at NIST’s Center for Nanoscale Science and Technology. By shining high-quality photons at the beam and analyzing photons emitted from the beam shortly thereafter, “we see a little bit of the quantum vibrational motion picked up in the output of light,” Purdy explained. Their measurement approach is sensitive enough to see these quantum effects all the way up to room temperature for the first time, and is published in this week’s issue of Science.
Although the experimental thermometers are in a proof-of-concept phase, the researchers envision they could be particularly valuable in electronic devices, as on-chip thermometers that never need calibration, and in biology.
“Biological processes, in general, are very sensitive to temperature, as anyone who has a sick child knows. The difference between 37 and 39 degrees Celsius is pretty large,” Taylor said. He foresees applications in biotechnology, when you want to measure temperature changes in “as small an amount of product as possible,” he said.
The researchers go in the opposite direction in a second proposed application for the beams, described in a theoretical paper published in Physical Review Letters.
Instead of letting heat hit the beam and allow it to serve as a temperature probe, the researchers propose using the beam to divert the heat from, for example, a sensitive part of an electromechanical device.
In their proposed setup, the researchers enclose the beam in a cavity, a pair of mirrors that bounce light back and forth. They use light to control the vibrations of the beam so that the beam cannot re-radiate incoming heat in its usual direction, towards a colder object.
For this application, Taylor likens the behavior of the beam to a tuning fork. When you hold a tuning fork and strike it, it radiates pure sound tones instead of allowing that motion to turn into heat, which travels down the fork and into your hand.
“A tuning fork rings for a long time, even in air,” he said. The two prongs of the fork vibrate in opposite directions, he explained, and cancel out a way for energy to leave the bottom of the fork through your hand.
The researchers even imagine using an optically controlled silicon nitride beam as the tip of an atomic force microscope (AFM), which detects forces on surfaces to build up atom-scale images. An optically controlled AFM tip would stay cool and perform better. “You’re removing thermal motion, which makes it easier to see signals,” Taylor explained.
This technique also could be put to use to make better metamaterials, complex composite objects that manipulate light or sound in new ways and could be used to make better lenses or even so-called “invisibility cloaks” that cause certain wavelengths of light to pass through an object rather than bouncing from it.
“Metamaterials are our answer to, ‘How do we make materials that capture the best properties for light and sound, or for heat and motion?'” Taylor said. “It’s a technique that has been widely used in engineering, but combining the light and sound together remains still a bit open on how far we can go with it, and this provides a new tool for exploring that space.”
Explore further: Fundamentally accurate quantum thermometer created
More information: “Quantum correlations from a room-temperature optomechanical cavity” Science (2017). science.sciencemag.org/cgi/doi/10.1126/science.aag1407
Xunnong Xu et al. Cooling a Harmonic Oscillator by Optomechanical Modification of Its Bath, Physical Review Letters (2017). DOI: 10.1103/PhysRevLett.118.223602
Posted: at 6:47 am
Big data is a challenge for all automakers, but especially German companies because they target affluent customers who want the latest technology.
At the same time, the focus on computing pits the automakers against Silicon Valley tech companies with far more experience in the field, and creates an opening for firms like Apple and Google, which are already encroaching on the car business.
Google has long been working on self-driving or autonomous cars, and Tim Cook, the chief executive of Apple, said this month that the company best known for making iPhones is focusing on autonomous systems for cars and other applications.
That has put pressure on automakers. German companies in particular have already made investments in ride-sharing services, in part to combat the rise of Uber, and are now looking further into the future.
Efforts by Volkswagen, trying to remake itself as a technology leader as it recovers from an emissions scandal, show how far into exotic realms of technology carmakers are willing to go.
Volkswagen, a German company, recently joined the handful of large corporations worldwide that are customers of D-Wave Systems, a Canadian maker of computers that apply the mind-bending principles of quantum physics.
While some experts question their usefulness, D-Wave computers, housed in tall, matte black cases that recall the obelisks in the science fiction classic “2001: A Space Odyssey,” can in theory process massive amounts of information at unheard-of speeds. Martin Hofmann, Volkswagen’s chief information officer, is a believer.
“For us, it’s a new era of technology,” Mr. Hofmann said in an interview at Volkswagen’s vast factory complex in Wolfsburg, Germany.
First theorized in the 1980s, quantum computers seek to harness the strange and counterintuitive world of quantum physics, which studies the behavior of particles at the atomic and subatomic level. While classical computers are based on bits with a value of either 1 or 0, the qubits in a quantum computer can exist in multiple states at the same time. That allows them, in theory, to perform calculations that would be beyond the powers of a typical computer.
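The bit-versus-qubit contrast can be written down in a few lines. This is a generic textbook illustration, not Volkswagen's or D-Wave's software:

```python
import numpy as np

# Minimal illustration of the bit/qubit contrast described above: a classical
# bit is 0 or 1, while a qubit's state is a complex amplitude vector that can
# weight both basis states at once.

zero = np.array([1, 0], dtype=complex)   # |0>
one = np.array([0, 1], dtype=complex)    # |1>

# Equal superposition: a measurement yields 0 or 1 with probability 0.5 each.
plus = (zero + one) / np.sqrt(2)
probs = np.abs(plus) ** 2
print(probs)  # [0.5 0.5]

# n qubits span 2**n amplitudes; this exponentially large state space is what
# a quantum computer exploits.
n = 10
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0
print(state.size)  # 1024
```

Ten qubits already require 1,024 amplitudes to describe, which hints at why certain calculations outgrow classical machines.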
This year Volkswagen used a D-Wave computer to demonstrate how it could steer the movements of 10,000 taxis in Beijing at once, optimizing their routes and thereby reducing congestion.
Because traffic patterns morph constantly, the challenge is to gather and analyze vehicle flows quickly enough for the data to be useful. The D-Wave computer was able to process in a few seconds information that would take a conventional supercomputer 30 minutes, said Florian Neukart, a scientist at a Volkswagen lab in San Francisco.
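The routing task is the kind of problem a D-Wave machine tackles: it is cast as a QUBO (quadratic unconstrained binary optimization) and the annealer searches for low-energy assignments. Below is a deliberately tiny sketch, with three taxis, two routes, and a brute-force search standing in for the annealer; all numbers are illustrative, and the Beijing demonstration involved 10,000 vehicles:

```python
import itertools

# Toy version of the congestion problem, cast in the QUBO form a D-Wave
# annealer minimizes. Binary variable x[i] says whether taxi i takes route A
# (0) or route B (1); the quadratic objective penalizes crowding either route.
# Three taxis and two routes are purely illustrative.

N = 3

def congestion(assignment):
    """Quadratic cost: squared load on each of the two routes."""
    on_b = sum(assignment)
    on_a = N - on_b
    return on_a ** 2 + on_b ** 2

# A real annealer samples low-energy states; here we brute-force all 2**N.
best = min(itertools.product([0, 1], repeat=N), key=congestion)
print(best, congestion(best))  # (0, 0, 1) 5
```

Splitting the taxis as evenly as possible minimizes the cost, which is exactly the congestion-reducing behavior the demonstration aimed for, at vastly larger scale.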
Such claims are met with skepticism by some experts, who say there is no convincing proof that D-Wave computers are faster than a well-programmed conventional supercomputer. And unlike a quantum computer, a supercomputer does not have components that must be kept at temperatures colder than deep space.
“If this were an application where D-Wave were actually faster, then it would be the first time we’d ever seen that,” said Scott Aaronson, a vocal D-Wave skeptic who is a professor of theoretical computer science at the University of Texas at Austin.
“It would be particularly astonishing that this milestone should happen first for a Volkswagen application problem,” Mr. Aaronson said in an email.
Volkswagen executives say they will publish the results of their work with D-Wave computers, allowing outsiders to try to debunk them.
If the D-Wave collaboration proves to be a misstep for Volkswagen, it would illustrate the hazards of big data for companies whose main focus for the past century has been the internal combustion engine. It also reflects the stakes for one of the world’s biggest carmakers.
Suppliers are also gearing up for an era of automotive big data. Bosch, the electronics maker based in a suburb of Stuttgart, said Monday that it would invest 1 billion euros, or $1.1 billion, to build a new factory in Dresden to produce chips for a variety of applications, including the sensors used in self-driving cars.
Bosch prefers to build its own chips rather than buy them from a supplier, said Christine Haas, director for connected services at the company. “When you have done it yourself, then you have a much deeper understanding of the technology,” she said.
Some car companies have decided to concentrate on what they do best and let others handle the computing.
Volvo Cars has been a pioneer in marrying digital technology and automobiles. It has turned to outside providers like Ericsson, a Swedish maker of telecommunications equipment, for computer technology. In May, Volvo said it would install Google’s Android operating system in new cars beginning in 2019. And the company is cooperating with Uber to develop self-driving cars.
“We are trying to embrace it,” Martin Kristensson, senior director for autonomous driving and connectivity strategy at Volvo, said of the challenge from Silicon Valley.
But, like Volkswagen, many carmakers are trying to develop such capabilities in-house. Mr. Stolle of BMW said that the carmaker, which hired more information technology specialists last year than mechanical engineers, needs huge data-crunching capability.
The company has a fleet of 40 prototype autonomous cars it is testing in cooperation with Intel, a chip maker; Mobileye, an Israeli self-driving technology company; and Delphi, an auto components supplier.
BMW uses artificial intelligence to analyze the enormous amounts of data compiled from test drives, part of a quest to build cars that can learn from experience and eventually drive themselves without human intervention.
After test sessions, hard disks in the cars are physically removed and connected to racks of computers at BMW’s research center near Munich. The data collected would fill the equivalent of a stack of DVDs 60 miles high, Mr. Stolle said.
That is much more than could be efficiently transmitted over the internet to remote data storage facilities operated by outside providers in the cloud.
“A large part of the data center has to be on premises,” Mr. Stolle said. “The amount is so huge it doesn’t work in the cloud.”
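That stack-of-DVDs figure is easy to sanity-check. Assuming standard 4.7 GB single-layer discs at 1.2 mm per disc (assumed values; the article does not specify), 60 miles of DVDs works out to a few hundred petabytes:

```python
# Back-of-the-envelope check on the "stack of DVDs 60 miles high" figure.
# Assumes 4.7 GB single-layer DVDs, each 1.2 mm thick (standard disc specs).

MILE_M = 1609.344
DISC_THICKNESS_M = 1.2e-3
DISC_CAPACITY_GB = 4.7

stack_height_m = 60 * MILE_M
n_discs = stack_height_m / DISC_THICKNESS_M
total_pb = n_discs * DISC_CAPACITY_GB / 1e6   # GB -> PB

print(f"{n_discs:.2e} discs, about {total_pb:.0f} PB")  # 8.05e+07 discs, about 378 PB
```

A data volume of that order does make routine transfer to remote cloud storage impractical, consistent with Mr. Stolle's point.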
Follow Jack Ewing on Twitter @JackEwingNYT.
A version of this article appears in print on June 23, 2017, on Page B4 of the New York edition with the headline: “Europe’s Car Giants Race to Outsmart Apple and Google.”
Posted: June 22, 2017 at 5:44 am
June 21, 2017 • Physics 10, 68
A team of experimentalists and theorists proposes a scalable protocol for quantum computation based on topological superconductors.
Adapted from T. Karzig et al., Phys. Rev. B (2017)
The Herculean thrust to realize a quantum computer by many research groups around the world is, in my opinion, one of the most exciting endeavors in physics in quite some time. Notwithstanding the potential applications that have motivated many companies in this endeavor, a quantum computer represents the most promising avenue to peer into quantum phenomena on a macroscopic scale. As with any such great effort, the race to build a quantum computer has many competitors pursuing a variety of approaches, some of which appear to be on the verge of creating a small machine. However, such small machines are unlikely to uncover truly macroscopic quantum phenomena, which have no classical analogs. This will likely require a scalable approach to quantum computation. A new study by Torsten Karzig from Microsoft Station Q, California, and colleagues brings together the expertise of a large and diverse group of physicists, ranging from experimentalists to topologists, to lay out a roadmap for a scalable architecture based on one of the most popular approaches.
Karzig and colleagues' paper represents a vision for the future of a sequence of developments that started with the seminal ideas of topological quantum computation (TQC) as envisioned by Alexei Kitaev and Michael Freedman in the early 2000s. The central idea of TQC is to encode qubits into states of topological phases of matter (see Collection on Topological Phases). Qubits encoded in such states are expected to be topologically protected, or robust, against the prying eyes of the environment, which are believed to be the bane of conventional quantum computation. This is because states of topological phases are locally indistinguishable from each other, so that qubits encoded in such states can evade the destructive coupling to the environment. But experimentally accessible topological phases of matter with the requisite properties for TQC, such as the ability to host quasiparticles known as Majorana zero modes, have been elusive. A milestone in this direction was reached in 2010, when researchers realized that the combination of rather conventional ingredients, such as special semiconductors, superconductors, and magnetic fields, could result in one such phase: a topological superconductor. This realization motivated experimentalists to discover signatures of this topological phase just a few years after its prediction. However, the topological superconductors, or Majorana nanowires as they are often called, made in these first experiments were plagued by device imperfections such as impurities. While topological robustness is supposed to protect devices from small imperfections, it is sometimes overlooked that the strength of such imperfections must be below a pretty low threshold for topological robustness to be operative.
A new wave of optimism swept the search for TQC-ready topological superconductors in 2016. That's when experimental groups from the University of Copenhagen and from the Delft University of Technology, led by Charlie Marcus and Leo Kouwenhoven, respectively, demonstrated high-quality Majorana nanowires that were likely to be in the topological regime [9, 10]. These devices, fabricated through epitaxial growth of superconducting aluminum on indium antimonide semiconductors, showed evidence of a high-quality superconducting gap and also of near energy degeneracy between the topological qubit states; a large energy difference between qubit states is often related to the detrimental decoherence rate of a qubit. However, the rules of the game of designing and fabricating Majorana nanowire devices have proven to be rather different from what had been anticipated. For example, it turns out that it is quite straightforward to drive the newly fabricated devices into the desirable Coulomb blockade regime (where the quantization of electronic charge dominates charge transport) but difficult to fabricate controllable contacts to connect the devices to superconducting circuitry. Interestingly, concurrent theoretical work has clarified that the topological qubit state of a Majorana nanowire can be measured via the phase shift of electron transport through the device when the transport is in the Coulomb blockade regime. This work led to suggestions that the basic operations for TQC could be performed using a procedure that relied on measurements of topological qubits.
Karzig and colleagues' study comes at a point in time where there is optimism for the realization of TQC using Majorana nanowires but possibly along a path with several constraints. For example, branched structures of a nanowire could be used to generate a network of wires for TQC, but superconducting contacts are only easy to make at the ends of the wire. This would mean that superconducting contacts must be avoided in making a large network of wires. Also, the qubit lifetime will ultimately likely be limited by quasiparticle poisoning, a phenomenon in which an anomalously large number of unwanted quasiparticles, arising from Cooper electron pairs broken by stray microwaves, exists in the devices. The Karzig study brings together a large number of authors with expertise in device fabrication, in strategies for TQC, and in the solid-state-physics issues involving Majorana nanowires. The researchers propose a protocol for scalable TQC based on the existing Majorana nanowires, assuming that they can be brought into the topological phase.
The protocol involves designing a network from small sets of Majorana wires and performing a sequence of measurements on the sets (Fig. 1). The central idea is to use physical constraints on the network, such as aligning all wires with a global magnetic field, to predict which sets may be measured easily to perform TQC. For example, the researchers considered networks made from sets of four and six wires (tetron and hexon designs) together with the rule that only nearby Majorana zero modes could be measured in each configuration. They then devised a strategy for TQC that optimizes robustness to quantities such as environmental temperature and noise as well as the size of the network. The result of the analysis is a few scalable architectures that future experimental groups could pick between, depending on their device-construction capabilities and computational goals. The hexon architectures are likely to be computationally more efficient than the tetron architectures but will probably be more difficult to construct.
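The measurement-based logic behind the tetron design can be illustrated numerically. The sketch below is my own toy construction, not code from the paper: it builds four Majorana operators from two fermionic modes (via a Jordan-Wigner-style representation) and checks that pairwise Majorana parities, exactly the kind of nearby-pair measurements the design restricts itself to, behave like the Pauli operators of one encoded qubit.

```python
import numpy as np

# Toy model of a "tetron": one qubit encoded in four Majorana operators.
# This only demonstrates the operator algebra, not the actual device protocol.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Four Majorana operators: Hermitian, square to identity, mutually anticommute.
g1, g2, g3, g4 = np.kron(X, I2), np.kron(Y, I2), np.kron(Z, X), np.kron(Z, Y)

# Logical qubit operators are pairwise Majorana parities.
Zq = 1j * g1 @ g2
Xq = 1j * g2 @ g3

# They behave like Pauli operators: each squares to 1 and they anticommute.
assert np.allclose(Zq @ Zq, np.eye(4))
assert np.allclose(Zq @ Xq + Xq @ Zq, 0)

# The total fermion parity commutes with both logical operators, so the
# encoded qubit lives inside a fixed-parity sector, hidden from any
# perturbation that touches only a single Majorana.
P = -g1 @ g2 @ g3 @ g4
assert np.allclose(P @ Zq - Zq @ P, 0)
assert np.allclose(P @ Xq - Xq @ P, 0)
```

Measuring these pairwise parities, rather than applying gates directly, is what lets the proposed architectures perform topological quantum computation through measurement sequences alone.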
While the scope of this work might be limited to these specific devices, detailed analysis of this kind is absolutely key to motivating both experimentalists and theorists to make progress towards a realistic platform for TQC that actually works in practice. The Karzig study likely lays the foundation for analogous work with other topological platforms as they become experimentally viable candidates for TQC. I must also clarify that the significance of this work does depend on whether future experiments meet the outstanding experimental challenges, foremost among which is the reliable generation of Majorana nanowires in a topological phase. That being said, I think Karzig and co-workers' paper will serve as a case study to follow, even if the properties of topological superconducting systems turn out to be somewhat different from the ones assumed.
This research is published in Physical Review B.
Jay Sau is an Assistant Professor of Physics at the University of Maryland (UMD), College Park. He holds a B.Tech. in electrical engineering from the Indian Institute of Technology (IIT) in Kanpur, India, and a Ph.D. in physics from the University of California at Berkeley. After postdoctoral positions at UMD and Harvard University, he joined the Physics Department at UMD in 2013. His research group develops theoretical tools in condensed-matter physics to predict and understand topological phases that might one day be used to perform topological quantum computation.
Torsten Karzig, Christina Knapp, Roman M. Lutchyn, Parsa Bonderson, Matthew B. Hastings, Chetan Nayak, Jason Alicea, Karsten Flensberg, Stephan Plugge, Yuval Oreg, Charles M. Marcus, and Michael H. Freedman
Phys. Rev. B 95, 235305 (2017)
Published June 21, 2017
Posted: at 5:44 am
Even if you're not that into heavy science, you're probably familiar with Schrödinger's cat, the thought experiment that allows us to consider quantum states in which more than one state is possible at once. The cat is in a closed box, and with it is a vial of poison, a hammer that can smash the vial, a Geiger counter, and a trace amount of radioactive material. The radioactive material, however, is such a small amount that the Geiger counter has only a 50 percent chance of detecting it. If it does, it will trigger the hammer's smashing of the vial, and the cat will die.
We won't know until we open the box whether the cat is alive or dead. We just know that each possibility, getting killed or surviving, is equally likely. So, until the box is opened, the cat exists in a kind of superposition: both alive and dead. Schrödinger's point was to demonstrate the apparent impossibility and silliness of such a state. But thanks to quantum physics, we now know it's not that silly and not necessarily impossible.
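The cat's 50/50 predicament can be written down directly as a two-state quantum system. The following sketch is illustrative only (the state labels and sample size are ours): it applies the Born rule to an equal superposition and "opens the box" many times.

```python
import numpy as np

rng = np.random.default_rng(0)

# The cat's state before opening the box, as an equal superposition:
# |psi> = (|alive> + |dead>) / sqrt(2).  Index 0 = alive, 1 = dead.
psi = np.array([1, 1]) / np.sqrt(2)

# Born rule: the probability of each outcome is the amplitude squared.
probs = np.abs(psi) ** 2          # [0.5, 0.5]

# "Opening the box" repeatedly: each measurement collapses to one outcome.
outcomes = rng.choice(["alive", "dead"], size=10_000, p=probs)
print((outcomes == "dead").mean())  # close to 0.5
```

Each individual opening yields a definite cat, alive or dead; only the statistics over many openings reveal the underlying superposition.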
Speaking of thought experiments used to talk about quantum physics that were devised by people who never even considered quantum physics, let's consider the Zeno effect and the anti-Zeno effect. Zeno of Elea was a philosopher who made it his life's mission to prove that everything was B.S., and he did that by devising paradoxes to demonstrate that even things that seem obviously true to us are, in fact, false. One of these is the arrow paradox, from which arise the Zeno effect and its corollary.
The Zeno effect works like this: in order to measure or observe something at a particular moment, it must be motionless. Say you want to see if an atom has decayed or not. In reality, although there are two possible states, most of the time the chances are not 50/50. That's because it takes time for something to decay, at least a tiny bit of time. Therefore, if you check on the atom quickly and often enough, it won't decay. The corollary anti-Zeno effect is also true: if you delay measurement until the atom is likely to have decayed, then keep this pattern going, you can force the system to decay more rapidly.
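A back-of-the-envelope calculation shows why frequent checks freeze the decay. At short times, quantum survival probability falls off quadratically rather than exponentially; assuming that standard short-time form p(t) ≈ 1 − (t/τ)², with an illustrative "Zeno time" τ:

```python
# Each measurement resets the state, so surviving N checks spread over a
# fixed window T has probability p(T/N)**N -- which approaches 1 as N grows.
# tau = 1.0 is an assumed characteristic decay scale, for illustration only.
tau, T = 1.0, 0.5

def survival(n_checks):
    dt = T / n_checks                      # time between measurements
    return (1 - (dt / tau) ** 2) ** n_checks

for n in (1, 10, 100, 1000):
    print(n, survival(n))   # survival probability rises toward 1 with n
```

With one check the atom survives with probability 0.75; with a thousand checks, better than 0.999. Watching the atom really does keep it from decaying.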
Scientists at Washington University in St. Louis wanted to know what happens if you disturb the system again and again but don't relay any data. In other words, they wanted to see whether it is the act of measurement and observation, or simply the disturbing influence, that causes the Zeno effect. To find out, they experimented with qubits and devised a "quasimeasurement," in which the atom is disturbed but no information about it is measured or relayed.
The team found that even quasimeasurements cause the Zeno effect. The quantum environment doesn't need to be connected to the outside environment for the disturbance to achieve the effect. These findings are interesting because they open up new areas of research into how we might be able to control quantum systems.
Oh, and by the way: no cats, philosophers, or physicists were hurt in the experiments.
Posted: June 21, 2017 at 4:48 am
In Brief: Chinese physicists have demonstrated long-distance quantum entanglement in space, breaking previous records. This development, made possible by a novel method, could lead to improved information storage and transfer in the future.
Spooky Action Gets to Space
When it comes to weird science stuff, quantum entanglement is probably near the top of the list, especially back in the days when Einstein referred to it as "spooky action at a distance." Physicists have since demonstrated the spooky phenomenon to be possible, but now they want to extend its reach. A new study shows it's possible for quantum entanglement to span far longer distances than previously demonstrated.
"We have demonstrated the distribution of two entangled photons from a satellite to two ground stations that are 1,203 kilometers [748 miles] apart," lead author Juan Yin, a physicist at the University of Science and Technology of China in Shanghai, explained in a research paper published in the journal Science. The previous record for entanglement distribution reached only 100 kilometers (62 miles).
Yin's team used Micius, the world's first quantum-enabled satellite, which China launched in 2016, to transmit entangled photons to several ground stations separated by long distances. They managed to achieve this feat by using laser beams to prevent the light particles from getting lost as they traveled.
"The result again confirms the nonlocal feature of entanglement and excludes the models of reality that rest on the notions of locality and realism," Yin and his colleagues wrote.
Though quantum entanglement is incredibly complex, it's possible to explain it in simple terms. Two or more particles are entangled, or linked, when a change in one's state or properties instantaneously affects the others'. What makes this stranger is that this link works regardless of distance. This phenomenon becomes particularly useful for storing information, as in the case of using quantum bits (qubits) in quantum computing.
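That always-correlated behavior is easy to see numerically. The sketch below samples measurements of a Bell state; it is an illustrative construction of the correlations, not the experiment's actual protocol, and the names are ours.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two-qubit Bell state |phi+> = (|00> + |11>) / sqrt(2).
# Basis order: |00>, |01>, |10>, |11>.
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)
probs = np.abs(phi_plus) ** 2       # [0.5, 0, 0, 0.5]

# Sample joint measurement outcomes in the computational basis.
samples = rng.choice(4, size=10_000, p=probs)
alice = samples // 2   # first qubit's bit
bob = samples % 2      # second qubit's bit

# Each party's results look like fair coin flips, yet they always agree.
print(alice.mean(), (alice == bob).mean())
```

Neither party's string is predictable on its own; the information lives entirely in the correlation, which is what makes entanglement useful for quantum communication.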
By proving that quantum entanglement can be maintained in space over such a long distance, this work paves the way for long-distance satellite quantum communication, and perhaps even quantum teleportation. "Long-distance entanglement distribution is essential for the testing of quantum physics and quantum networks," Yin's team wrote.
Advances in quantum cryptography, which rely heavily on extending entanglement, could change the way information is stored and transferred in the future, opening up applications in improved security for communication and even payment systems.
Posted: June 19, 2017 at 7:45 pm
China has used a laser on a satellite orbiting 480 kilometres above the earth to produce entangled photons and beam them to stations on the ground. Picture: Cai Yang/Xinhua via ZUMA
IN A bid to build an entirely new kind of internet, completely secure and impervious to hackers, China has pulled off a major feat in particle physics.
Chinese scientists have set a new distance record for beaming a pair of entangled particles: photons of light that behave like twins and experience the exact same things simultaneously, even though they're separated by great distances.
The principle is called quantum entanglement and it's one of the subatomic world's weirdest phenomena. And China has smashed the distance record for quantum entanglement.
In a groundbreaking experiment led by Professor Jian-Wei Pan of the University of Science and Technology of China in Hefei, a laser on a satellite orbiting 480 kilometres above the earth produced entangled photons.
They were then transmitted to two different ground-based stations 1200 kilometres apart, without breaking the link between the photons, the researchers said in a report published in the journal Science.
That distance achieved in the experiment is 10 times greater than the previous record for entanglement and is also the first time entangled photons have been generated in space.
"It's a huge, major achievement," Thomas Jennewein, a physicist at the University of Waterloo in Canada, told Science. "They started with this bold idea and managed to do it."
China launched its first quantum satellite in August and, if all goes according to plan, will send up plenty more to create a system of communication that relies on entanglement.
A COMPLETELY NEW INTERNET
By launching a group of quantum-enabled satellites, China hopes to create a super secure network that uses an encryption technique based on the principles of a field known as quantum communication.
"In physics we are trying, and we have demonstrated some encryption techniques that rely on the law of physics rather than the mathematical complexity, and we call this quantum key distribution," Professor Ping Koy Lam from the ANU's Department of Quantum Science told news.com.au last year, before China launched its first quantum satellite.
"For that to work you need to send laser beams that carry certain information, quantum information, and then you need the senders and the receivers to get together to find a protocol to secure the communication."
The reason it can't be hacked is that the information carried in the quantum state of a particle cannot be measured or cloned without destroying the information itself.
"We can show that this kind of quantum encryption works in a city radius or at most between two nearby cities," Prof Lam said.
However, China believes the near-vacuum of space will allow the photons to travel further without disruption, because in space there's nothing to attenuate light.
In the latest experiment, both stations which received the photons were in the mountains of Tibet, at a height that reduced the amount of air the fragile photons had to traverse.
The successful characterisation of quantum features under such conditions is a precondition for a global quantum communication network using satellites that would link metropolitan-area quantum networks on the ground. Picture: Google, ESA. Source: Supplied
A NEW SPACE RACE
China's ongoing progress will no doubt be watched closely by security agencies around the world.
While a communication network enabled via quantum satellites is still a long way off, China's steady progress toward that goal has led to predictions of a new space race.
Quantum technology has been a major focus of China's five-year economic development plan, released in March 2016. While other space agencies have been experimenting with the technology, none have seen the level of financial support provided by Beijing.
China has not disclosed how much money it has spent on quantum research, but funding for basic research, which includes quantum physics, was $US101 billion in 2015, a massive increase from the $US1.9 billion the country spent in 2005.
Scientists in the US, Canada, Europe and Japan are also rushing to exploit the power of particle physics to create secure communication systems, but China's latest experiment puts the country well ahead of the pack.
China launched the world's first quantum satellite on top of a Long March-2D rocket from the Jiuquan Satellite Launch Center in northwest China. Picture: Zuma. Source: Supplied
Posted: June 18, 2017 at 11:40 am
A Chinese satellite has split pairs of “entangled photons” and transmitted them to separate ground stations 745 miles (1,200 kilometers) apart, smashing the previous distance record for such a feat and opening new possibilities in quantum communication.
In quantum physics, when particles interact with each other in certain ways they become “entangled.” This essentially means they remain connected even when separated by large distances, so that an action performed on one affects the other.
In a new study published online today (June 15) in the journal Science, researchers report the successful distribution of entangled photon pairs to two locations on Earth separated by 747.5 miles (1,203 km). [The 18 Biggest Unsolved Mysteries in Physics]
Quantum entanglement has interesting applications for testing the fundamental laws of physics, but also for creating exceptionally secure communication systems, scientists have said. That’s because quantum mechanics states that measuring a quantum system inevitably disturbs it, so any attempt to eavesdrop is impossible to hide.
But, it’s hard to distribute entangled particles normally photons over large distances. When traveling through air or over fiber-optic cables, the environment interferes with the particles, so with greater distances, the signal decays and becomes too weak to be useful.
In 2003, Pan Jianwei, a professor of quantum physics at the University of Science and Technology of China, started work on a satellite-based system designed to beam entangled photon pairs down to ground stations. The idea was that because most of the particle’s journey would be through the vacuum of space, this system would introduce considerably less environmental interference.
“Many people then thought it [was] a crazy idea, because it was very challenging already doing the sophisticated quantum-optics experiments inside a well-shielded optical table,” Pan told Live Science. “So how can you do similar experiments at thousand-kilometers distance scale and with the optical elements vibrating and moving at a speed of 8 kilometers per second [5 miles per second]?”
In the new study, researchers used China’s Micius satellite, which was launched last year, to transmit the entangled photon pairs. The satellite features an ultrabright entangled photon source and a high-precision acquiring, pointing and tracking (APT) system that uses beacon lasers on the satellite and at three ground stations to line up the transmitter and receivers.
Once the photons reached the ground stations, the scientists carried out tests and confirmed that the particles were still entangled despite having traveled between 994 miles and 1,490 miles (1,600 and 2,400 km), depending on what stage of its orbit the satellite was positioned at.
Only the lowest 6 miles (10 km) of Earth’s atmosphere are thick enough to cause significant interference with the photons, the scientists said. This means the overall efficiency of their link was vastly higher than previous methods for distributing entangled photons via fiber-optic cables, according to the scientists. [Twisted Physics: 7 Mind-Blowing Findings]
“We have already achieved a two-photon entanglement distribution efficiency a trillion times more efficient than using the best telecommunication fibers,” Pan said. “We have done something that was absolutely impossible without the satellite.”
Apart from carrying out experiments, one of the potential uses for this kind of system is for “quantum key distribution,” in which quantum communication systems are used to share an encryption key between two parties that is impossible to intercept without alerting the users. When combined with the correct encryption algorithm, this system is uncrackable even if encrypted messages are sent over normal communication channels, experts have said.
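The sifting step at the heart of such schemes can be sketched in a few lines. The toy below models the prepare-and-measure cousin (BB84-style) of the entanglement-based distribution described here, with no eavesdropper present; all names and sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy BB84-style key sifting round.
n = 1000
alice_bits = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)   # 0 = rectilinear, 1 = diagonal
bob_bases = rng.integers(0, 2, n)

# With no eavesdropper, Bob reads Alice's bit whenever their bases match;
# mismatched bases yield a random bit and are discarded during sifting.
match = alice_bases == bob_bases
bob_bits = np.where(match, alice_bits, rng.integers(0, 2, n))

key_alice = alice_bits[match]
key_bob = bob_bits[match]
print(len(key_alice), (key_alice == key_bob).all())
```

Roughly half the raw bits survive sifting, and an eavesdropper who measured in-flight photons would corrupt a detectable fraction of the matched bits, which is the tamper-evidence property that makes the key exchange secure.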
Artur Ekert, a professor of quantum physics at the University of Oxford in the United Kingdom, was the first to describe how entangled photons could be used to transmit an encryption key.
“The Chinese experiment is quite a remarkable technological achievement,” Ekert told Live Science. “When I proposed the entangled-based quantum key distribution back in 1991 when I was a student in Oxford, I did not expect it to be elevated to such heights!”
The current satellite is not quite ready for use in practical quantum communication systems, though, according to Pan. For one, its relatively low orbit means each ground station has coverage for only about 5 minutes each day, and the wavelength of photons used means it can only operate at night, he said.
Boosting coverage times and areas will mean launching new satellites with higher orbits, Pan said, but this will require bigger telescopes, more precise tracking and higher link efficiency. Daytime operation will require the use of photons in the telecommunications wavelengths, he added.
But while developing future quantum communication networks will require considerable work, Thomas Jennewein, an associate professor at the University of Waterloo’s Institute for Quantum Computing in Canada, said Pan’s group has demonstrated one of the key building blocks.
"I have worked in this line of research since 2000 and researched on similar implementations of quantum-entanglement experiments from space, and I can therefore very much attest to the boldness, dedication and skills that this Chinese group has shown," he told Live Science.
Original article on Live Science.
Posted: at 11:40 am
Quantum mechanics is the body of scientific laws that describe the wacky behavior of photons, electrons and the other particles that make up the universe.
Quantum mechanics is the branch of physics relating to the very small.
It results in what may appear to be some very strange conclusions about the physical world. At the scale of atoms and electrons, many of the equations of classical mechanics, which describe how things move at everyday sizes and speeds, cease to be useful. In classical mechanics, objects exist in a specific place at a specific time. However, in quantum mechanics, objects instead exist in a haze of probability; they have a certain chance of being at point A, another chance of being at point B and so on.
Quantum mechanics (QM) developed over many decades, beginning as a set of controversial mathematical explanations of experiments that the math of classical mechanics could not explain. It began at the turn of the 20th century, around the same time that Albert Einstein published his theory of relativity, a separate mathematical revolution in physics that describes the motion of things at high speeds. Unlike relativity, however, the origins of QM cannot be attributed to any one scientist. Rather, multiple scientists contributed to a foundation of three revolutionary principles that gradually gained acceptance and experimental verification between 1900 and 1930. They are:
Quantized properties: Certain properties, such as position, speed and color, can sometimes only occur in specific, set amounts, much like a dial that “clicks” from number to number. This challenged a fundamental assumption of classical mechanics, which said that such properties should exist on a smooth, continuous spectrum. To describe the idea that some properties “clicked” like a dial with specific settings, scientists coined the word “quantized.”
Particles of light: Light can sometimes behave as a particle. This was initially met with harsh criticism, as it ran contrary to 200 years of experiments showing that light behaved as a wave, much like ripples on the surface of a calm lake. Light behaves similarly in that it bounces off walls and bends around corners, and that the crests and troughs of the wave can add up or cancel out. Added wave crests result in brighter light, while waves that cancel out produce darkness. A light source can be thought of as a ball on a stick being rhythmically dipped in the center of a lake. The color emitted corresponds to the distance between the crests, which is determined by the speed of the ball's rhythm.
Waves of matter: Matter can also behave as a wave. This ran counter to the roughly 30 years of experiments showing that matter (such as electrons) exists as particles.
In 1900, German physicist Max Planck sought to explain the distribution of colors emitted over the spectrum in the glow of red-hot and white-hot objects, such as light-bulb filaments. When making physical sense of the equation he had derived to describe this distribution, Planck realized it implied that combinations of only certain colors (albeit a great number of them) were emitted, specifically those that were whole-number multiples of some base value. Somehow, colors were quantized! This was unexpected because light was understood to act as a wave, meaning that values of color should be a continuous spectrum. What could be forbidding atoms from producing the colors between these whole-number multiples? This seemed so strange that Planck regarded quantization as nothing more than a mathematical trick. According to Helge Kragh in his 2000 article in Physics World magazine, "Max Planck, the Reluctant Revolutionary," "If a revolution occurred in physics in December 1900, nobody seemed to notice it. Planck was no exception ..."
Planck’s equation also contained a number that would later become very important to future development of QM; today, it’s known as “Planck’s Constant.”
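Planck's constant ties a photon's color directly to its energy through E = hf = hc/λ. A quick calculation with standard constant values (the 550 nm wavelength is just an illustrative choice for green light):

```python
# Photon energy from Planck's relation E = h * c / wavelength.
h = 6.62607015e-34   # Planck's constant, J*s (exact, 2019 SI definition)
c = 2.99792458e8     # speed of light, m/s

def photon_energy_joules(wavelength_m):
    return h * c / wavelength_m

# Green light at 550 nm carries roughly 3.6e-19 J per photon.
print(photon_energy_joules(550e-9))
```

The tininess of that number is why quantization went unnoticed for so long: everyday light contains so many photons that the energy appears perfectly continuous.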
Quantization helped to explain other mysteries of physics. In 1907, Einstein used Planck’s hypothesis of quantization to explain why the temperature of a solid changed by different amounts if you put the same amount of heat into the material but changed the starting temperature.
Since the early 1800s, the science of spectroscopy had shown that different elements emit and absorb specific colors of light called "spectral lines." Though spectroscopy was a reliable method for determining the elements contained in objects such as distant stars, scientists were puzzled about why each element gave off those specific lines in the first place. In 1888, Johannes Rydberg derived an equation that described the spectral lines emitted by hydrogen, though nobody could explain why the equation worked. This changed in 1913 when Niels Bohr applied Planck's hypothesis of quantization to Ernest Rutherford's 1911 "planetary" model of the atom, which postulated that electrons orbited the nucleus the same way that planets orbit the sun. According to Physics 2000 (a site from the University of Colorado), Bohr proposed that electrons were restricted to "special" orbits around an atom's nucleus. They could "jump" between special orbits, and the energy produced by the jump caused specific colors of light, observed as spectral lines. Though quantized properties were invented as a mere mathematical trick, they explained so much that they became the founding principle of QM.
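Rydberg's formula, 1/λ = R(1/n₁² − 1/n₂²), can be evaluated directly; the Balmer series (n₁ = 2) reproduces hydrogen's visible lines, the very lines Bohr's quantized orbits explained:

```python
# Rydberg's formula for hydrogen spectral lines.
R = 1.0968e7  # Rydberg constant for hydrogen, 1/m

def wavelength_nm(n1, n2):
    return 1e9 / (R * (1 / n1**2 - 1 / n2**2))

# Balmer series (n1 = 2): n2 = 3 gives the red H-alpha line near 656 nm,
# n2 = 4 the blue-green line near 486 nm, n2 = 5 the violet line near 434 nm.
for n2 in (3, 4, 5):
    print(n2, round(wavelength_nm(2, n2), 1))
```

In Bohr's picture, each of these wavelengths corresponds to an electron jumping from orbit n₂ down to orbit n₁ = 2, with the energy difference emitted as a single photon.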
In 1905, Einstein published a paper, "Concerning an Heuristic Point of View Toward the Emission and Transformation of Light," in which he envisioned light traveling not as a wave, but as some manner of "energy quanta." This packet of energy, Einstein suggested, could "be absorbed or generated only as a whole," specifically when an atom "jumps" between quantized vibration rates. This would also apply, as would be shown a few years later, when an electron "jumps" between quantized orbits. Under this model, Einstein's "energy quanta" contained the energy difference of the jump; when divided by Planck's constant, that energy difference determined the color of light carried by those quanta.
With this new way to envision light, Einstein offered insights into the behavior of nine different phenomena, including the specific colors that Planck described being emitted from a light-bulb filament. It also explained how certain colors of light could eject electrons off metal surfaces, a phenomenon known as the "photoelectric effect." However, Einstein wasn't wholly justified in taking this leap, said Stephen Klassen, an associate professor of physics at the University of Winnipeg. In a 2008 paper, "The Photoelectric Effect: Rehabilitating the Story for the Physics Classroom," Klassen states that Einstein's energy quanta aren't necessary for explaining all of those nine phenomena. Certain mathematical treatments of light as a wave are still capable of describing both the specific colors that Planck described being emitted from a light-bulb filament and the photoelectric effect. Indeed, in Einstein's controversial winning of the 1921 Nobel Prize, the Nobel committee only acknowledged "his discovery of the law of the photoelectric effect," which specifically did not rely on the notion of energy quanta.
Roughly two decades after Einstein’s paper, the term “photon” was popularized for describing energy quanta, thanks to the 1923 work of Arthur Compton, who showed that light scattered by an electron beam changed in color. This showed that particles of light (photons) were indeed colliding with particles of matter (electrons), thus confirming Einstein’s hypothesis. By now, it was clear that light could behave both as a wave and a particle, placing light’s “wave-particle duality” into the foundation of QM.
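Compton's scattering result follows a simple formula, Δλ = (h/mₑc)(1 − cos θ): the wavelength shift depends only on the scattering angle and the electron's "Compton wavelength." A quick check of the numbers with standard constant values:

```python
import math

# Compton wavelength shift for light scattering off electrons:
# d_lambda = (h / (m_e * c)) * (1 - cos(theta)).
h = 6.62607015e-34     # Planck's constant, J*s
m_e = 9.1093837e-31    # electron mass, kg
c = 2.99792458e8       # speed of light, m/s

compton_wavelength = h / (m_e * c)   # ~2.43e-12 m

def shift_m(theta_rad):
    return compton_wavelength * (1 - math.cos(theta_rad))

# Back-scattering (180 degrees) gives the maximum shift: twice the
# Compton wavelength, about 4.85e-12 m.
print(shift_m(math.pi))
```

That the shift depends on angle exactly as billiard-ball kinematics predicts is what made Compton's measurement such convincing evidence for particle-like photons.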
Since the discovery of the electron in 1896, evidence that all matter existed in the form of particles was slowly building. Still, the demonstration of light's wave-particle duality made scientists question whether matter was limited to acting only as particles. Perhaps wave-particle duality could ring true for matter as well? The first scientist to make substantial headway with this reasoning was a French physicist named Louis de Broglie. In 1924, de Broglie used the equations of Einstein's theory of special relativity to show that particles can exhibit wave-like characteristics, and that waves can exhibit particle-like characteristics. Then in 1925, two scientists, working independently and using separate lines of mathematical thinking, applied de Broglie's reasoning to explain how electrons whizzed around in atoms (a phenomenon that was unexplainable using the equations of classical mechanics). In Germany, physicist Werner Heisenberg (teaming with Max Born and Pascual Jordan) accomplished this by developing "matrix mechanics." Austrian physicist Erwin Schrödinger developed a similar theory called "wave mechanics." Schrödinger showed in 1926 that these two approaches were equivalent (though Swiss physicist Wolfgang Pauli sent an unpublished result to Jordan showing that matrix mechanics was more complete).
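De Broglie's relation λ = h/p puts numbers on matter waves. For an electron accelerated through 100 volts (an illustrative voltage chosen because the result lands near atomic length scales):

```python
import math

# De Broglie wavelength of an electron accelerated through a potential V:
# momentum p = sqrt(2 * m_e * e * V) (non-relativistic), then lambda = h / p.
h = 6.62607015e-34    # Planck's constant, J*s
m_e = 9.1093837e-31   # electron mass, kg
e = 1.602176634e-19   # elementary charge, C

def electron_wavelength_m(volts):
    p = math.sqrt(2 * m_e * e * volts)
    return h / p

# ~1.2e-10 m, about one angstrom -- comparable to atomic spacing.
print(electron_wavelength_m(100))
```

Because this wavelength matches the spacing between atoms in a crystal, electrons diffract off crystal lattices just as light diffracts off a grating, which is how matter waves were confirmed experimentally.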
The Heisenberg-Schrödinger model of the atom, in which each electron acts as a wave (sometimes referred to as a "cloud") around the nucleus of an atom, replaced the Rutherford-Bohr model. One stipulation of the new model was that the ends of the wave that forms an electron must meet. In "Quantum Mechanics in Chemistry, 3rd Ed." (W.A. Benjamin, 1981), Melvin Hanna writes, "The imposition of the boundary conditions has restricted the energy to discrete values." A consequence of this stipulation is that only whole numbers of crests and troughs are allowed, which explains why some properties are quantized. In the Heisenberg-Schrödinger model of the atom, electrons obey a "wave function" and occupy "orbitals" rather than orbits. Unlike the circular orbits of the Rutherford-Bohr model, atomic orbitals have a variety of shapes ranging from spheres to dumbbells to daisies.
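The simplest textbook illustration of how boundary conditions force discrete energies is the "particle in a box" (a standard example, not one the article itself works through): a wave confined to a box of width L can only fit whole numbers of half-wavelengths, so its energy is restricted to Eₙ = n²h²/(8mL²) for integer n.

```python
H = 6.62607015e-34      # Planck's constant (J*s)
M_E = 9.1093837015e-31  # electron mass (kg)
EV = 1.602176634e-19    # joules per electron-volt

def box_energy(n: int, length_m: float, mass_kg: float = M_E) -> float:
    """Energy of level n for a particle in a 1-D box of width L:
    E_n = n^2 h^2 / (8 m L^2). Only integer n satisfy the boundary
    condition that the wave vanish at both walls."""
    return n**2 * H**2 / (8.0 * mass_kg * length_m**2)

# Allowed energies (in eV) for an electron confined to a 1-nm box:
levels = [box_energy(n, 1e-9) / EV for n in (1, 2, 3)]
```

Because the energy grows as n², the allowed levels are unevenly spaced, and no energy between them is permitted; this is quantization arising purely from the requirement that the wave's ends "meet."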
In 1927, Walter Heitler and Fritz London further developed wave mechanics to show how atomic orbitals could combine to form molecular orbitals, effectively showing why atoms bond to one another to form molecules. This was yet another problem that had been unsolvable using the math of classical mechanics. These insights gave rise to the field of "quantum chemistry."
Also in 1927, Heisenberg made another major contribution to quantum physics. He reasoned that since matter acts as waves, some properties, such as an electron's position and speed, are "complementary," meaning there's a limit (related to Planck's constant) to how precisely each property can be known. Under what would come to be called "Heisenberg's uncertainty principle," it was reasoned that the more precisely an electron's position is known, the less precisely its speed can be known, and vice versa. This uncertainty principle applies to everyday-size objects as well, but is not noticeable because the lack of precision is extraordinarily tiny. According to Dave Slaven of Morningside College (Sioux City, IA), if a baseball's speed is known to within a precision of 0.1 mph, the maximum precision to which it is possible to know the ball's position is 0.000000000000000000000000000008 millimeters.
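The baseball figure above can be reproduced with a rough sketch, assuming the standard form of the principle, Δx·Δp ≥ ħ/2 with Δp = mΔv, and a regulation baseball mass of 0.145 kg (a value the article does not state):

```python
HBAR = 1.054571817e-34  # reduced Planck constant, h / (2*pi), in J*s
MPH_TO_M_S = 0.44704    # miles per hour to meters per second

def min_position_uncertainty(mass_kg: float, dv_m_s: float) -> float:
    """Smallest possible position uncertainty allowed by
    dx * dp >= hbar / 2, taking dp = mass * dv."""
    return HBAR / (2.0 * mass_kg * dv_m_s)

# A 0.145-kg baseball whose speed is known to within 0.1 mph:
dx_mm = min_position_uncertainty(0.145, 0.1 * MPH_TO_M_S) * 1000.0
print(dx_mm)  # ~8e-30 mm, matching the figure quoted in the text
```

Running the same function with an electron's mass instead of a baseball's shows why the effect dominates at atomic scales: the bound on Δx grows by a factor of roughly 10²⁹.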
The principles of quantization, wave-particle duality and the uncertainty principle ushered in a new era for QM. In 1927, Paul Dirac applied a quantum understanding of electric and magnetic fields to give rise to the study of "quantum field theory" (QFT), which treated particles (such as photons and electrons) as excited states of an underlying physical field. Work in QFT continued for a decade until scientists hit a roadblock: Many equations in QFT stopped making physical sense because they produced results of infinity. After a decade of stagnation, Hans Bethe made a breakthrough in 1947 using a technique called "renormalization." Bethe realized that the infinite results all traced back to two phenomena (specifically "electron self-energy" and "vacuum polarization"), and that the observed values of electron mass and electron charge could be used to make all the infinities disappear.
Since the breakthrough of renormalization, QFT has served as the foundation for developing quantum theories about the four fundamental forces of nature: 1) electromagnetism, 2) the weak nuclear force, 3) the strong nuclear force and 4) gravity. The first insight provided by QFT was a quantum description of electromagnetism through "quantum electrodynamics" (QED), which made strides in the late 1940s and early 1950s. Next was a quantum description of the weak nuclear force, which was unified with electromagnetism to build "electroweak theory" (EWT) throughout the 1960s. Finally came a quantum treatment of the strong nuclear force using "quantum chromodynamics" (QCD) in the 1960s and 1970s. The theories of QED, EWT and QCD together form the basis of the Standard Model of particle physics. Unfortunately, QFT has yet to produce a quantum theory of gravity. That quest continues today in the studies of string theory and loop quantum gravity.
Robert Coolman is a graduate researcher at the University of Wisconsin-Madison, finishing up his Ph.D. in chemical engineering. He writes about math, science and how they interact with history. Follow Robert @PrimeViridian.