The Evolutionary Perspective
Daily Archives: June 15, 2017
Posted: June 15, 2017 at 9:45 pm
This artist's impression shows the region around the star-forming area with the scientists' observations as an overlay. Credit: Veronica Allen/Alexandra Elconin
An international team of astronomers, led by Dutch scientists, has discovered a region in our Milky Way that contains many nitrogen compounds in the southeast of a butterfly-shaped star formation disk and very little in the northwest. The astronomers suspect that multiple stars-to-be share the same star formation disk, but the precise process is still a puzzle. The article with their findings has been accepted for publication in Astronomy & Astrophysics.
The team studied the star-forming region G35.20-0.74N, more than 7,000 light-years from Earth in the southern sky. The astronomers used the (sub)millimeter telescope ALMA, located on the Chajnantor plateau in Chile, which can map the molecular gas clouds in which stars form.
The researchers saw something special in the disk around a young, heavy star. While large amounts of oxygen-containing and sulfur-containing hydrocarbons were present throughout the disk, the astronomers found only nitrogen-containing molecules in the southeastern part of the disk. In addition, it was 150 degrees warmer on the nitrogen side than on the other side of the disk.
Based on these observations, the scientists suspect that there are multiple stars forming at the same time in one disk and that some stars are hotter or heavier than others. The researchers expect the disk to eventually break into several smaller disks as the stars grow.
A few years ago, chemical differences were observed in a star-forming region in Orion. First author Veronica Allen (University of Groningen and SRON): "The area in Orion is five times bigger than our area. We have probably been lucky, because we expect such a chemical difference to be short-lived."
Second author Floris van der Tak (University of Groningen and SRON): “Many of the nitrogen molecules are poisonous cyanides. We do not know much about them because it is dangerous to work with those molecules in laboratories on earth.”
The astronomers are now investigating the star formation cloud in more detail. Allen: "Maybe we can see the disk break into smaller disks in real time." In addition, the astronomers are building models to see how differences in age, mass, temperature or gas density can also produce differences in chemical composition.
More information: V. Allen et al., "Chemical segregation in hot cores with disk candidates: an investigation with ALMA," Astronomy & Astrophysics (2017). DOI: 10.1051/0004-6361/201629118
Posted: at 9:45 pm
SALT LAKE CITY – With an impending deep solar eclipse overshadowing their efforts, the Clark Planetarium hosted a gala Thursday to foster excitement for astronomy education.
In anticipation of a solar eclipse that will be viewable across much of the United States on Aug. 21, the Clark Planetarium has renewed its efforts to offer education resources and draw excitement to its programs for students with the help of former NASA scientist Phil Plait.
“Total eclipses are rare, and we haven’t had one in the United States for quite some time,” said Tom Beckett, an organizer of the planetarium gala. “This is a great opportunity to use an astronomical event to get people interested in astronomy.”
Though Salt Lake City will not see the totality of the eclipse (only 91 percent partial coverage), people may see the complete event from as close as Driggs, Idaho.
The planetarium’s gala is a fundraiser to create astronomy education resources.
Plait returned to the planetarium for his third speaking appearance. Known as the “Bad Astronomer,” he offered a keynote speech to explain the mechanics behind the eclipse and dispel some of the misunderstandings about eclipses.
“There are a lot of eclipse myths like, if you look at it, you’ll go blind,” Plait said.
Plait, who began public speaking while working on the Hubble Space Telescope, said he sees his speaking engagements as something of a stand-up routine for science. He refers to himself as a science communicator and earned the title of the "Bad Astronomer" through his efforts at dispelling scientific misconceptions and creating humor around the concepts.
The risk associated with viewing an eclipse, he explained, comes at the end of the roughly two-minute period of totality, when the moon fully covers the sun. That interval allows the pupil of the eye to dilate, adjusting to the shadow cast by the moon, and the risk of injury follows as the moon continues on, suddenly exposing the brightness of the sun once again.
Plait noted that despite this effect, he has yet to encounter a documented case of anyone becoming totally blinded by a passing eclipse.
“You can lose a little bit of your vision forever, or all of it for a short time, but your eye can heal,” he said.
Beckett said there will be educators and telescopes available at the planetarium and throughout Salt Lake County during the eclipse to accommodate viewers who are not able to drive to Idaho to see the full eclipse.
Beckett also said the planetarium will host a viewing party as the Earth comes into alignment with Saturn and the sun, offering the best opportunity people will have to see the rings of Saturn for another 17 years.
Posted: at 9:45 pm
NEW YORK – Smithsonian astrophysicist Martin Elvis would like to see astronomers take on a crucial role for future asteroid mining: as astronomical prospectors scoping out the next big catch.
Elvis, a researcher with the Harvard-Smithsonian Center for Astrophysics in Massachusetts, discussed his dream for applied astronomy June 4 here at the Dawn of Private Space Science Symposium. Efficient asteroid mining would jump-start a space economy and bring down costs for exploration and space science, guiding humans into a modern space age, he said.
"My basic goal is just to revolutionize our exploration of the solar system, of the universe," Elvis said at the conference.
Right now, he said, spaceflight and space science are unsustainably expensive. But asteroid mining could play a critical role in making those endeavors doable on a smaller budget, just as private companies like SpaceX have decreased the launch cost per pound of payload.
But asteroid mining will face a critical problem, Elvis said: How to choose which asteroids will be worth the trip. And astronomers can play a crucial role in that determination, he said.
“The problem with asteroids is not many of them are valuable. You’ve got to find the right ones,” he said. “We want to throw away that gray, stony stuff and deal with the carbonaceous or metallic ones, depending on whether you’re looking for water or precious metals like platinum and palladium. So, this is where we [astronomers] come in.”
As an example, Elvis pointed to the twin Magellan 6.5-meter telescopes in Chile. Professional astronomers could use telescopes of that size to characterize a faint asteroid in about 1-2 minutes. Eighty-five percent of asteroids could be thrown out based just on their color, he said, and the remaining 15 percent would be good prospects for sending small, exploratory probes using the data gathered about the objects’ orbits and sizes.
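The triage Elvis describes can be pictured as a simple filter: discard the stony bodies, keep the carbonaceous and metallic candidates. A minimal sketch in Python; the asteroid entries and spectral-type labels below are invented for illustration, not survey data:

```python
# Toy triage: discard the ~85 percent of asteroids whose color marks
# them as ordinary stony bodies, and keep the carbonaceous (water) and
# metallic (precious-metal) candidates for follow-up probes.
candidates = [
    {"id": "2017-A", "spectral_type": "S"},  # stony: discard
    {"id": "2017-B", "spectral_type": "C"},  # carbonaceous: water
    {"id": "2017-C", "spectral_type": "M"},  # metallic: platinum group
    {"id": "2017-D", "spectral_type": "S"},  # stony: discard
]

VALUABLE_TYPES = {"C", "M"}
prospects = [a["id"] for a in candidates
             if a["spectral_type"] in VALUABLE_TYPES]
print(prospects)  # ['2017-B', '2017-C']
```

The surviving prospects would then be the ones worth the telescope time needed to pin down orbits and sizes.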
Even a few nights per year would allow for the characterization of about 300 such objects, he said. And as larger telescopes come online, like the European Extremely Large Telescope and the Giant Magellan Telescope, the midsize telescopes could become more accessible for even more space-mining projects, he said.
"This means astronomers can turn out to be useful again [like] what [they] used to be, back in the days of navigation," he said. Similar to modern-day mining on Earth, there could be a multistep process of prospecting remotely ("you don't just go straight to start digging rocks") before making a trip, Elvis added.
Such a process could cut asteroid prospecting costs by a factor of 10, he said. That would allow asteroid mining to flourish, lowering the cost commercially to put people and science in space.
On Earth, most of the precious metals, like platinum and palladium, are located 3,700 miles (6,000 kilometers) down, but they can come much nearer to the surface on asteroids. Those metals dissolved in iron and were drawn to the center of the Earth, Elvis said; the same thing happened on asteroids, but the asteroids were then smashed up enough to make the precious metals much more accessible. (Comets also contain valuable resources, especially water, Elvis said, but the energy needed to reach those fast-moving bodies makes them less worth the cost to explore.)
So far, Elvis has talked to the asteroid-mining companies Planetary Resources and Deep Space Industries, but neither company initially believed that this kind of remote prospecting would be necessary, he said.
“Both of them are dominated by engineers who are very good at building small spacecraft, and I’m sure they will succeed at building interplanetary cubesat-scale spacecraft for prospecting at the asteroid, but they were initially unbelieving of what I just told you,” Elvis said.
They might come around, though, he added. “One of the companies did eventually realize that this was a necessary precursor to their sending out satellites,” he said. “The other still isn’t interested.”
Original article on Space.com.
Posted: at 9:45 pm
A cloud of gas and young stars in the Perseus molecular cloud may be revealing a strange truth about the universe: most, if not all, stars are born in pairs. This means that somewhere out there, the Sun has a lost companion, and it may be one of several known stars.

Essentially all stars form in molecular clouds. In the Perseus observations, nearly all of these stars were gravitationally bound. This may be a requirement of protostars: the egg-like objects could require a common center of gravity with a companion to accumulate mass. The dense cores then use leftover material to form more stars, continuing the process.

So why doesn't the Sun have a binary companion (well, depending on who you ask)? It seems that about 60 percent of stars shed their binary sister over time, drifting farther from their partner until they are gravitationally severed. The pairs also may not be symmetric in mass, meaning that some former companions could be brown dwarfs cast out by larger stars.

The authors of the paper, accepted in the Monthly Notices of the Royal Astronomical Society, say more work is needed to confirm their hypothesis. But if it's true, the hunt may be on for the companion the Sun once had.
Posted: at 9:44 pm
Tools are always shaped by their uses. When cloud computing first came on the scene, it was a form of hosted virtualization, and its goal was to look like a bare-metal server.
Infrastructure as a service (IaaS) shaped the earliest cloud services, and it still dominates public cloud as well as the private cloud software market. Even so, that doesn’t mean it’s going to be the source of future cloud opportunity.
Cloud providers are always strategizing for the future, and their plans reveal an important — and already underway — shift. Every major public cloud provider has added services to process events. In particular, providers are adding features to help developers build applications for the internet of things (IoT). Could these be the basis for the most transformational set of applications to come along since the internet itself?
Legacy applications follow a pattern that’s decades old: Work comes to the applications that support it. In traditional cloud computing, users pay for the processing resources they use. The terms differ, but it’s essentially a lease of virtual infrastructure. This is a direct mirror of what happens in a data center — a server farm is loaded with applications and transactions are routed to the correct server in the pool. This approach is great where work is persistent, as in the case of a retail banking application that runs continuously.
Event-driven and IoT apps change this critical notion of persistence. An event can pop up anywhere, at any time. It would be wasteful, perhaps prohibitively wasteful, to dedicate an IaaS instance to wait around for an event. Or the instance might reside within a data center halfway around the world from where the event occurs. If all possible event sources were matched with traditional cloud hosting points, most would sit idle much of the time, doing little but running up costs.
The reason why there’s a specific right or wrong place to process events is simple: delay. Most events have a specific response-time expectation. Imagine a machine that triggers spray paint when an item passes a sensor. Picture a self-driving vehicle that approaches a changing traffic light.
The information flow between an event and the receipt of the appropriate response is called a control loop. Most events require a short control loop, which means that their processing needs to happen close to the point of the event. That requirement forces event-handling processes to disperse out toward the cloud edge, and to multiply in number.
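To see why distance matters, here is a back-of-the-envelope sketch of the floor a control loop can never beat: round-trip propagation delay. The ~200,000 km/s figure for light in optical fiber is a common rule of thumb, not a number from the article:

```python
# Lower bound on a control loop's round-trip delay: the signal must
# travel to the processing point and back. Light in optical fiber
# covers roughly 200,000 km per second (rule-of-thumb figure).
def min_round_trip_ms(distance_km: float,
                      km_per_s: float = 200_000.0) -> float:
    return 2 * distance_km / km_per_s * 1_000

# A data center halfway around the world (~20,000 km) can never answer
# in less than about 200 ms -- far too slow for a paint-spray trigger.
print(round(min_round_trip_ms(20_000)))  # 200

# An edge site ~100 km away keeps the floor near 1 ms.
print(round(min_round_trip_ms(100)))     # 1
```

Real loops add queuing, routing and processing time on top of this physical minimum, which only strengthens the case for putting event handlers near the event.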
It’s easy to see how the scarcity of events at a given point creates a problem of cloud efficiency and pricing for traditional cloud computing. It’s also possible to have too many events. The cloud can allow for cloud bursting, or scaling capacity by spinning up multiple copies of an application component on demand, but it’s not that easy.
Few applications written to run on a bare-metal server can seamlessly scale or replace failed instances. Those cloud capabilities aren’t common in data centers, where legacy applications run. Moving the applications to the cloud doesn’t add the features necessary to scale applications, either.
Multiple copies of an application component require load balancing, and many applications were not designed to allow any copy of a component to handle any event or request. Applications that work by assuming a string of requests in context can’t work if half the string goes to one copy of the application and the other half to another. How do we make IoT apps scalable and resilient? They have to be rewritten.
Developers are doing just that, and big cloud providers are responding. In particular, they all see the same IoT-and-event future for the cloud. They have been steadily enhancing their cloud offerings to prepare for that future. Not only do the cloud giants offer special web services to manage IoT devices and connections, they now provide tools to support the kind of programming that IoT apps will demand.
The functional or lambda style of programming doesn’t allow an application or component to store data between uses. As a result, all instances of the component can process an event. Cloud providers now offer functional or microservice support instead of simply providing infrastructure, platform or software as a service, because a function cloud is very different.
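A minimal sketch of what that lambda-style handler looks like: everything the function needs arrives in the event itself, and nothing is remembered between invocations, so any instance anywhere can process any event. The event shape and function name here are invented for illustration, not any provider's actual API:

```python
import json

# A lambda-style handler: all context arrives in the event, and no
# state is stored between invocations, so any copy of this function,
# in any location, can serve any event.
def handle_sensor_event(event: dict) -> dict:
    return {
        "sensor": event["sensor_id"],
        "alert": event["reading"] > event["threshold"],
    }

result = handle_sensor_event(
    {"sensor_id": "line-3", "reading": 87.5, "threshold": 80.0}
)
print(json.dumps(result))  # {"sensor": "line-3", "alert": true}
```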
Where is your function hosted in a function cloud? Everywhere. Nowhere. Functions are activated anywhere they’re needed — when they’re needed — and you pay when you use one. Function clouds for IoT, or any kind of event-driven computing, represent the ultimate flexibility and agility. They also demand that users take care to establish policies on just how much function hosting they are willing to pay for, a decision they’ll have to make based on the combination of cost and those pesky control-loop lengths.
Amazon has even allowed for the possibility that IoT will demand cloud applications that migrate outside the cloud. Their Amazon Web Services (AWS) Greengrass platform is a software and middleware framework that lets users execute AWS-compatible functions on their own hardware. This capability will let IoT users do some local processing of events to keep those control loops short, but still host deeper, less time-critical functions in the AWS cloud.
The old cloud model made you pay for hosting instances. In the function cloud, you don’t host instances in the usual way. You have extemporaneous execution of functions, as needed. This is what gives rise to the pay-as-you-go or serverless description of the function cloud, but that’s short of the mark. You could price any cloud computing service, running any application, on a usage basis, but that doesn’t make those cloud services scalable or easily optimized. Without these features, serverless is just a pricing strategy.
Developers will have to make changes in applications to accommodate IoT and function clouds. Almost every new program or service stores information, and this makes it difficult to scale. The rule of functional programming is stateless, meaning that the output you get from a process is based only on what you provide as input. There are even programming languages designed to enforce stateless behavior on developers; it’s not second nature.
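The difference the stateless rule makes can be shown in a few lines. Two load-balanced copies of a stateful component diverge when they split one stream of requests; the stateless version returns the same answer no matter which instance computes each step (a toy illustration, not any particular framework):

```python
# Stateful: each copy keeps its own running total, so two copies that
# split one stream of requests give different answers.
class StatefulCounter:
    def __init__(self):
        self.total = 0

    def add(self, x: int) -> int:
        self.total += x
        return self.total

a, b = StatefulCounter(), StatefulCounter()
a.add(1)  # request 1 lands on copy a
b.add(2)  # request 2 lands on copy b
print(a.add(3), b.add(3))  # 4 5 -- the copies have diverged

# Stateless: the running total travels with the request, so every copy
# produces the same output for the same input.
def add(total: int, x: int) -> int:
    return total + x

print(add(add(add(0, 1), 2), 3))  # 6, whichever instance ran each step
```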
The notion of the function cloud is likely to accelerate a trend that’s already started in response to the use of mobile devices and implementation of BYOD policies. Companies have found that they are creating application components designed to format information for mobile devices, interface with apps written for a variety of mobile platforms and provide consistent support from back-end applications often running in data centers.
These forces combine to create a two-tier model of an application. The device-handling front tier lives in the cloud and takes advantage of the cloud’s ability to distribute applications globally. The cloud part then creates traditional transactions for the core business applications, wherever they are.
IoT is even more distributed than mobile workers, and some IoT events need short control loops. As a result, cloud hosting of the front-end part of applications could see explosive growth. That puts pressures on the off-ramp of this kind of two-tier application structure because many events might generate many transactions. These transactions can overwhelm core business applications. Cloud providers are working on this, too. Microsoft, for example, has a cloud-distributed version of the service bus typically used to feed business applications with work.
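The off-ramp idea, many small events absorbed at the cloud tier and handed to the core application as fewer, batched transactions, can be sketched as a simple buffer (the batch size and event shape below are illustrative only, not Microsoft's service bus API):

```python
from collections import deque

# Absorb many small events at the cloud tier and hand them to the
# core application as fewer, batched transactions.
class EventBuffer:
    def __init__(self, batch_size: int):
        self.batch_size = batch_size
        self.pending = deque()
        self.batches_sent = 0  # stand-in for back-end transactions

    def publish(self, event: dict) -> None:
        self.pending.append(event)
        if len(self.pending) >= self.batch_size:
            self.flush()

    def flush(self) -> None:
        if self.pending:
            self.pending.clear()   # hand the batch to the back end
            self.batches_sent += 1 # one transaction per batch

buf = EventBuffer(batch_size=10)
for i in range(25):
    buf.publish({"event": i})
buf.flush()  # drain the remainder
print(buf.batches_sent)  # 3 -- 25 events became 3 transactions
```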
Given that IoT is in its infancy — and cloud IoT is even younger — it’s easy to wonder why cloud providers are already offering IoT features. There are three reasons. First, IoT could radically increase IT spending, and cloud providers want to grab some of that as potential new revenue. Second, IoT isn’t the only thing that generates events. A lot of mobile worker interaction, for example, looks like event processing. Finally, functional programming techniques are being promoted for every kind of processing. IoT apps demand them. Developer tools and conferences are already describing how functional programming techniques can make programs better and more maintainable.
If you’re writing functions for any reason, isn’t using a function cloud inevitable?
That’s the big question that every cloud provider and cloud user needs to think about. Fully scalable applications — ones that can increase or decrease capacity under load and repair themselves by simply loading another copy — are very useful to businesses. The functional programming techniques developed for IoT apps, and the function clouds to support those techniques, will remake programs.
Tools are defined by their uses, remember? Well, users are already seeing the cloud of the future in event processing, and IoT will accelerate that trend. IoT’s potential to generate events over a wide area, in large numbers, while demanding short control loops will revolutionize cloud use.
Posted: at 9:44 pm
Forty years ago, a personal computer cost around $500,000 in today's money and was accessible only to large corporations. Today, as the cliché goes, that kind of processing power is available to most people in an affordable mobile phone. Quantum computing, however, is a different matter. Quantum computers are stuck in the 1950s: there aren't many of them, they cost tens of millions of dollars, and they take up entire rooms.
One of today's very rare and very costly quantum machines is being developed by D-Wave Systems Inc., a company whose CEO happens to be Vern Brownell, a former CTO of Goldman Sachs. Goldman is one of several lead investors in D-Wave, which it describes as having a head start in the field. While most quantum computing rivals are still in their infancy, D-Wave has already been using its system for machine learning. Competitors are eyeing the same plot: 1QB Information Technology Systems Inc. (1QBit), a Vancouver-based quantum computing company, counts derivatives exchange CME Group among its investors. An RBS banker who led 1QBit's 2015 financing round told Bloomberg that quantum computing is perfect for the data-rich, time-sensitive world of financial markets.
An opportunity has therefore arisen to write machine learning algorithms for quantum computers and then implement them using the D-Wave 2000Q, the company's first commercially available quantum computer. Training on the system will be made available too.
The quantum machine learning program is being run by the Creative Destruction Lab (CDL), a seed funding program for science-based companies based in Toronto. Last month, it invited applications for 40 places on an initiative intended to develop and sponsor a wave of quantum machine learning start-ups. The next (and last) round of applications closes on Monday July 24th.
Daniel Mulet, associate director of machine learning at CDL, says they've already received 42 applications, around 10% of which are biased towards financial services. "Some are very early stage and have been submitted by students, but others are companies that have already been launched," says Mulet. "There's one that's working with a hedge fund, looking for patterns in trading data."
Traditional computers use binary code to solve problems: a bit can be a 1 or a 0. Quantum computers use qubits: a qubit can be a 1, a 0, or a 1 AND a 0. As Bloomberg points out, if you have two qubits you can have four potential states: 00, 01, 10, and 11. Moreover, the number of states a quantum computer can take into consideration is 2 raised to the power of the number of qubits: if you had a 50-qubit universal quantum computer, you could explore 1.125 quadrillion states simultaneously.
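The state-count arithmetic above is easy to check in plain Python:

```python
# A register of n qubits spans 2**n basis states.
def state_count(qubits: int) -> int:
    return 2 ** qubits

# Two qubits: the four states 00, 01, 10 and 11.
print(state_count(2))   # 4

# Fifty qubits: 1,125,899,906,842,624 states,
# roughly the 1.125 quadrillion cited above.
print(state_count(50))  # 1125899906842624
```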
"Quantum computers are able to process much larger quantities of data much faster," says Mulet. "It's our belief that these new quantum hardware platforms built by D-Wave or 1QBit will be used for various machine learning applications in the next few years. When that happens, we want to be ready to leverage that. One day all Bloomberg terminals will be run on quantum computers."
It's not hard to see why Goldman is interested.
If you're interested too and want to apply, you have 39 days to polish your application. As a further lure to candidates, those selected will be mentored by the likes of William Tunstall-Pedoe, a Cambridge AI entrepreneur, and Barney Pell, chief strategy officer at San Francisco-based Loco-Mobi (which is applying AI to parking your car). Those graduating from the program, which begins in September, will receive $80k in funding in return for 8% of the equity in their company.
Mulet says ideal applicants will have a master's or PhD in a quantitative subject, and be proficient in Python programming and the use of TensorFlow, Google's open source library for machine learning.
Photo credit: "Quantum foam" by Alex Sukontsev is licensed under CC BY 2.0.
Posted: at 9:44 pm
Accenture has partnered with quantum software startup 1QBit to develop a quantum-enabled molecular comparison application for US multinational biotechnology firm Biogen.
The application is expected to improve advanced molecular design to speed up drug discovery for complex neurological conditions such as multiple sclerosis, Alzheimer’s, and Parkinson’s.
Researchers at Accenture Labs worked with 1QBit to create the new application, which enhances Biogen’s existing molecule comparison method through quantum computing.
Molecular comparison is a crucial part of early-phase drug design and discovery, Accenture explained, and involves intensive computational methods to review molecule matches and predict the positive effects of a therapy or drug while reducing negative side effects.
As quantum computing has the potential to find the answer to complex problems millions of times faster than classical computing by leveraging the properties of quantum physics, Accenture said the new application provides insights into the molecular comparison process as well as much deeper contextual information about how, where, and why molecules match.
This is expected to enable scientists and researchers to analyse large collections of molecules more quickly and cost effectively.
“At Biogen, we’re always looking to harness cutting-edge technologies that push the boundaries of traditional pharmaceutical research to discover new treatments and cures for complex neuroinflammatory and neurodegenerative conditions,” Govinda Bhisetti, head of Computational Chemistry at Biogen, said.
“Collaborating with researchers at Accenture Labs and 1QBit made it possible to rapidly pilot and deploy a quantum-enabled application that has the potential to enable us to bring medicines to people faster.”
Accenture Labs said it has identified more than 150 use cases with clients where quantum computing would be relevant, and is working with clients across multiple industries to prepare for the arrival of mainstream quantum computing.
Also on Friday, Accenture expanded its partnership with SAP to include working with SAP Leonardo, the ERP giant’s digital innovation system that combines differentiating software capabilities in machine learning, the Internet of Things (IoT), big data, analytics, and blockchain on its SAP Cloud Platform.
Accenture will be integrating more than 50 of its enterprise analytics applications with SAP Leonardo, spanning finance and accounting, supply chain, procurement, human capital management, and sales and customer service.
“Today, we’re at an incredible tipping point,” said Pierre Nanterme, Accenture chairman and CEO. “We’re face to face with an era of tremendous business transformation where the fundamental rules of how we create value are being rewritten. What we’re announcing today is a bold step in defining the rules for the intelligent enterprise.”
The companies first began working together 18 months ago on SAP S/4HANA, aiming to simplify and fast-track the “digital journeys” of its clients. SAP and Accenture had partnered back in 2010 for Business ByDesign and in 2014 expanded their global alliance through an agreement to offer cloud-based offerings designed for industry-specific and technology-enabled operations.
Consortium Applies Quantum Computing to Drug Discovery for Neurological Diseases – Drug Discovery & Development
Posted: at 9:44 pm
Accenture and quantum software firm 1QBit collaborated with Biogen to develop a first-of-its-kind quantum-enabled molecular comparison application that could significantly improve advanced molecular design to speed up drug discovery for complex neurological conditions such as multiple sclerosis, Alzheimer's, Parkinson's and Lou Gehrig's disease.
Researchers at Accenture Labs collaborated with 1QBit to create the new application, which enhances Biogen’s existing molecule comparison method with quantum capabilities. Molecular comparison is a crucial part of early-phase drug design and discovery, and involves intensive computational methods to review molecule matches and predict the positive effects of a therapy or drug while reducing negative side effects.
By leveraging quantum computing, a computing paradigm that has the potential to find the answer to complex business problems millions of times faster than classical computing by leveraging the properties of quantum physics, the new application provides novel insights into the molecular comparison process, as well as much deeper contextual information about how, where and why molecules match. This is expected to enable scientists and researchers to analyze large collections of molecules more quickly and cost-effectively.
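The announcement does not spell out the formulation, but molecular comparison of this kind is commonly cast as a graph-matching problem encoded as a QUBO (quadratic unconstrained binary optimization), the problem form that quantum annealers solve. The sketch below is a minimal, illustrative assumption of that encoding, with a brute-force classical solver standing in for quantum hardware; all names and scores are hypothetical.

```python
# Hypothetical sketch: atom-to-atom molecular comparison as a QUBO.
# Binary variable x[(i, j)] = 1 means atom i of molecule A is matched
# to atom j of molecule B. This is NOT Accenture/1QBit's actual code.
from itertools import product

def build_qubo(n_a, n_b, reward, penalty=10.0):
    """Reward good atom pairings; penalize matching one atom twice.
    reward[(i, j)] scores pairing atom i of A with atom j of B."""
    var = {(i, j): idx
           for idx, (i, j) in enumerate(product(range(n_a), range(n_b)))}
    q = {}
    for (i, j), v in var.items():
        q[(v, v)] = -reward.get((i, j), 0.0)       # minimize = maximize reward
    for (i1, j1), v1 in var.items():
        for (i2, j2), v2 in var.items():
            if v1 < v2 and (i1 == i2 or j1 == j2):  # conflicting assignments
                q[(v1, v2)] = penalty
    return q, var

def solve_brute_force(q, n_vars):
    """Classical stand-in for the annealer: enumerate all bitstrings."""
    best, best_e = None, float("inf")
    for bits in product([0, 1], repeat=n_vars):
        e = sum(c * bits[a] * bits[b] for (a, b), c in q.items())
        if e < best_e:
            best, best_e = bits, e
    return best, best_e

# Toy example: two 2-atom "molecules" where pairing 0-0 and 1-1 scores best.
reward = {(0, 0): 1.0, (1, 1): 1.0, (0, 1): 0.2, (1, 0): 0.2}
q, var = build_qubo(2, 2, reward)
bits, energy = solve_brute_force(q, len(var))
matches = [(i, j) for (i, j), v in var.items() if bits[v]]
```

On real quantum hardware, the brute-force step would be replaced by sampling low-energy states of the same QUBO; the encoding itself is unchanged.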
“At Biogen, we’re always looking to harness cutting-edge technologies that push the boundaries of traditional pharmaceutical research to discover new treatments and cures for complex neuroinflammatory and neurodegenerative conditions,” said Govinda Bhisetti, head of computational chemistry at Biogen. “Collaborating with researchers at Accenture Labs and 1QBit made it possible to rapidly pilot and deploy a quantum-enabled application that has the potential to enable us to bring medicines to people faster.”
“In just over two months, Accenture Labs, Biogen and 1QBit progressed from an exploratory conversation about quantum business experimentation to an enterprise-ready, quantum-enabled application that generates molecular comparison results with deeper insights about shared traits. As quantum computers become more readily available, it will become easier for pharmaceutical companies to identify and develop new medicines for a wide range of diseases and conditions,” said Jeff Elton, Ph.D., managing director, Accenture Strategy, Life Sciences.
“Accenture Labs is focused on helping clients across multiple industries prepare for the arrival of mainstream quantum computing, which offers great potential to solve challenges in entirely new ways through quantum-enabled optimization, sampling, and machine learning algorithms,” said Marc Carrel-Billiard, senior managing director, Accenture Labs. “Through our collaboration with Biogen, 1QBit and our colleagues in the Accenture Life Sciences industry group, we have achieved a breakthrough that confirms the speed and accuracy of the quantum-enabled method for molecular comparison and takes another significant step toward improving the pharmaceutical industry’s drug discovery and design process to help deliver better patient and economic outcomes more efficiently.”
According to the Accenture Technology Vision 2017 companion survey of more than 5,400 business and IT executives, 40 percent of respondents are taking proactive steps to prepare for quantum computing, with 36 percent planning to invest in quantum capabilities in the next two years.
Playing a key role in Accenture’s overall Innovation Architecture, Accenture Labs helps clients harness emerging technologies to change the way the world works and lives. Given the potential for quantum computing to disrupt the computing landscape in the next two to five years, helping clients identify opportunities and begin working with quantum computing to stay ahead of the broader introduction and deployment of associated technologies is a key focus area. Accenture Labs has already identified more than 150 use cases with clients, from portfolio optimization in the financial services sector to production scheduling in manufacturing, where quantum computing would be relevant.
View original post here:
Chinese satellite breaks a quantum physics record, beams entangled photons from space to Earth – Los Angeles Times
Posted: at 9:43 pm
Chinese scientists have just set a record in quantum physics.
For the first time, pairs of entangled photons have been beamed from a satellite in orbit to two receiving stations almost 1,500 miles away on Earth.
At the same time, the researchers were able to deliberately separate the entangled photon pairs along a greater distance than has ever been recorded.
The experiment, described Thursday in the journal Science, represents the first measurable proof of an idea that has long been theorized but never tested, experts said.
“This is the first time you have a quantum channel between a satellite and the ground that you can actually use,” said Norbert Lütkenhaus, a professor at the Institute for Quantum Computing at the University of Waterloo in Canada who was not involved in the new work. “People have been talking about doing it for many, many years, but these guys actually did it.”
Keep reading to learn what this new work means, and why it matters.
Great question. For starters, a photon is a tiny particle of light. In fact, it’s the smallest unit that light can be broken into. It has no mass and no charge.
Entangled photons are a pair of photons whose properties are linked, and remain that way no matter how far apart they get.
“If you make a measurement on one of the photons, you get a perfectly correlated outcome on the other member of the pair,” Lütkenhaus said.
And that will hold true no matter how many times you look at them.
“One measurement alone doesn’t tell you they are entangled; you need to repeat it many times,” he said. “With entangled photons, no matter what you measure, or how many times you measure, or which side of the pair you measure, you always get perfect correlation.”
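The correlation Lütkenhaus describes can be illustrated with a toy simulation. This is purely illustrative, not the satellite's protocol: it models same-basis measurements on the Bell state |Φ+⟩ = (|00⟩ + |11⟩)/√2, where each outcome is individually random but the pair always agrees.

```python
# Toy model of measuring both halves of an entangled photon pair
# in the same basis: each result is a 50/50 coin flip (Born rule),
# yet the two results always agree. Illustrative sketch only.
import random

def measure_bell_pair(rng):
    """Same-basis measurement of a |Phi+> pair: one random outcome,
    perfectly mirrored by the partner photon."""
    outcome = rng.randint(0, 1)   # individually random
    return outcome, outcome       # but perfectly correlated

rng = random.Random(42)
results = [measure_bell_pair(rng) for _ in range(1000)]
# A single trial can't reveal entanglement; correlation over many trials can.
agreement = sum(a == b for a, b in results) / len(results)
```

As the article notes, one measurement alone proves nothing: any single trial looks like a coin flip, and only the repeated, perfect agreement (here, 100 percent over 1,000 trials) is the signature of the correlation.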
Another great question. This one is more difficult to answer.
Scientists have not been able to explain why entanglement occurs. All they know is that it exists.
Einstein referred to the phenomenon of entanglement as “spooky action at a distance.” Others have said it is kind of like the physics version of voodoo.
They built a special satellite to do it.
The spacecraft, nicknamed Micius after a famous 5th-century B.C. Chinese philosopher and scientist, launched in August 2016.
It is loaded with a special crystal that can split a single incoming photon into two daughter photons with joint properties. For this experiment, instruments on the satellite separated the entangled photons and sent them to different receiving stations on Earth.
To do this, Micius had to aim at its targets with an amazing degree of precision, said Jian-Wei Pan, a physicist at the University of Science and Technology of China who led the work.
“It’s the equivalent of clearly seeing a human hair at a distance of 900 feet,” he said.
It is extreme. And, experts say, challenging.
“Designing, launching and operating a satellite with this capability is no easy feat,” Lütkenhaus said. “I see this as a great engineering triumph.”
But, as the study demonstrates, using a satellite to send beams of entangled photons to Earth is a better strategy than using optical fibers to distribute them.
The greatest distance scientists have been able to separate entangled photons using optical fibers is 62 miles. By sending the entangled photons through space, Pan and his team were able to separate entangled photons by more than 620 miles.
Not immediately, but eventually, it probably will.
For example, distributing entangled photons over large distances could be used to establish unhackable communications via what’s known as quantum cryptography.
This application relies on another strange aspect of quantum mechanics: namely, that the simple act of observing a photon disturbs it and causes it to change its orientation.
Scientists have already been able to establish secure, quantum channels using fiber optics, but there is a limit to how far those can stretch.
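The observation-disturbance principle is easiest to see in a toy BB84-style simulation. Note this is an illustrative assumption, not the paper's scheme (which is entanglement-based): an eavesdropper who measures photons in a randomly guessed basis disturbs them, leaving a telltale error rate that the legitimate parties can detect.

```python
# Toy BB84-style sketch: why observing a photon betrays an eavesdropper.
# Illustrative only; the satellite experiment uses entangled pairs instead.
import random

def bb84_error_rate(n_bits, eavesdrop, rng):
    """Fraction of sifted bits where the receiver disagrees with the sender."""
    errors = kept = 0
    for _ in range(n_bits):
        bit = rng.randint(0, 1)
        basis = rng.randint(0, 1)             # sender encodes in a random basis
        state_bit, state_basis = bit, basis
        if eavesdrop:
            basis_e = rng.randint(0, 1)       # eavesdropper guesses a basis
            if basis_e != state_basis:        # wrong guess disturbs the photon
                state_bit = rng.randint(0, 1)
                state_basis = basis_e
        basis_b = rng.randint(0, 1)           # receiver picks a random basis
        if basis_b == state_basis:
            received = state_bit
        else:
            received = rng.randint(0, 1)      # wrong-basis readout is random
        if basis_b == basis:                  # sifting: keep matching bases
            kept += 1
            if received != bit:
                errors += 1
    return errors / kept

rng = random.Random(0)
clean = bb84_error_rate(20000, eavesdrop=False, rng=rng)
tapped = bb84_error_rate(20000, eavesdrop=True, rng=rng)
```

With no eavesdropper the sifted key is error-free; with one, roughly a quarter of the sifted bits flip, which is exactly the anomaly that alerts the communicating parties.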
“Using the space-based quantum channel, the authors have shown it is possible to significantly extend the distance over which one can perform such a secure communication,” said Jürgen Volz, a physicist at the Vienna Center for Quantum Science and Technology who was not involved in the work.
“In the time of the Internet, when more and more sensitive information is shared and exchanged via the web, this is of tremendous importance,” he said.
But experts say an application like that may still be 10 years away.
Although the experiment was successful, the rate of sending and receiving entangled photons described in the paper was still quite low. Of nearly 6 million entangled photon pairs generated by Micius each second, only one pair was detected at stations here on Earth.
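A quick back-of-the-envelope calculation shows how severe that loss is in link-budget terms. Taking the article's "nearly 6 million" as 5.9 million pairs per second (an assumed reading of the figure) against one detected pair per second:

```python
# Back-of-envelope link budget for the reported detection rate.
# The 5.9e6 source rate is an assumption based on the article's
# "nearly 6 million entangled photon pairs" per second.
import math

pairs_generated_per_s = 5.9e6   # assumed source rate on the satellite
pairs_detected_per_s = 1.0      # two-photon coincidences at the ground stations

efficiency = pairs_detected_per_s / pairs_generated_per_s  # ~1.7e-7
loss_db = -10 * math.log10(efficiency)                     # ~68 dB total loss
```

A two-photon survival rate on the order of one in several million (roughly 68 dB of loss) explains why the demonstrated rate, while record-setting, is still far from practical key-distribution speeds.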
“The communication rates here are not yet sufficient for a practical application,” said Wenjamin Rosenfeld, a physicist at the Ludwig-Maximilians University in Munich.
However, he added that the mission represents a “proof-of-principle” demonstration of a quantum communication protocol that could be available in the near future.
Pan put it this way: “This is the first baby step for quantum entanglement experiments going into space. It is really new!”
Do you love science? I do! Follow me @DeborahNetburn and “like” Los Angeles Times Science & Health on Facebook.
Here is the original post:
Posted: at 9:43 pm
New York Times
Moving to Scuttle Obama Legacy, Donald Trump to Crack Down on Cuba
WASHINGTON President Trump on Friday will move to halt the historic rapprochement between the United States and Cuba set in motion by former President Barack Obama, delivering a speech in Miami in which he plans to announce he is clamping down …
See the original post here: