The Evolutionary Perspective
Daily Archives: June 1, 2017
Posted: June 1, 2017 at 11:10 pm
There are better Star Trek video games out there, but none that come close to delivering the immersive cooperative experience offered by Ubisoft’s virtual reality-powered Star Trek: Bridge Crew. It’s the closest you can get to being on the bridge of a Federation starship.
Announced at E3 2016 and delayed a couple of times since, Star Trek: Bridge Crew gives one to four players the ability to live out their science fiction fantasies on the bridge of the U.S.S. Aegis. Players take up their posts at the helm (driving), tactical (scanning and weapons), engineering (giving her all she’s got) or the captain’s chair (barking orders).
Once the crew is assembled they can embark on a series of missions that will test the limits of their piloting, combat, and general bullshitting skills, just like the television shows.
You can play Star Trek: Bridge Crew by yourself, hopping from station to station with a click of a button. Fewer than four players can group up, with AI taking over whichever positions are unmanned. But the game is best with a full crew complement.
You can also play as the original TOS Enterprise, but it’s all switches and buttons. As one crew member put it, “It’s like driving a Model T.”
As with many virtual reality games, getting started is a little awkward. Players gather in the ship’s ready room before launching into a mission, seated at four sides of a table in a room filled with cool Star Trek things they can’t touch (why can’t we play 4D Chess?). Arms flail about awkwardly as crew positions and missions are selected. The game relies heavily on voice communication, so expect plenty of goofy Trek humour between missions.
So far I’ve yet to play with anyone who wasn’t completely awesome. These guys were great.
Once a mission gets underway, the silly things suddenly aren’t quite as ridiculous. In a full game everyone has a role to fill, and everybody needs to be on point to make it work. The helmsman’s manoeuvrability and warp capability depends on how much power the engineer delivers to essential systems.
The tactical officer can’t scan mysterious objects in space until the helmsman gets the ship in close. And nobody has all of the information at their fingertips except for the captain, who needs to keep up with mission objectives that change quickly depending on the situation.
When it all comes together, it goes a little something like this:
Note that this is not me playing with a group of friends (or at least they weren’t friends when we started). This is me playing on my Oculus Rift with like-minded strangers. Maybe they’re playing on PlayStation VR or the HTC Vive, but we know our roles (mostly) and work together like Starfleet professionals.
The mission above involves a great deal of stealth. Save for an early mishap involving a cloaked Klingon ship, we managed to get in and out of some incredibly sticky situations without being detected. With me at the helm we skirted the edge of the Klingon sensors.
Hundreds of years into the future and we’re still using touch screens.
Tactical analysed local anomalies, discovering one that helped obscure us from patrolling enemies.
I am bad at tactical. I tend to shoot things that didn’t need shooting.
Engineering kept power to the engines low to ensure we were running as silently as possible. The captain sat behind us all, conducting the mission like the symphony section of a grand space opera, only with less passion and more science.
Did a bunch of civilians get vaporised by our tactical officer? Sure, but the needs of the many often outweigh those of the few. Maybe they shouldn’t have been hanging around military technology too valuable to let fall into Klingon hands.
Being able to see your teammates working and talking makes dire situations feel slightly less so. The lipsync tech here is pretty sweet.
You may also notice that I keep communications mostly formal in the video, referring to the captain by rank. Sometimes, OK, a lot of the time, I speak in a pseudo-Sulu voice. It’s not a conscious decision on my part. It’s the whole Star Trek vibe, dragging me off into deep space.
Here’s another video I did that’s not featured on our main YouTube channel for reasons that will be pointedly evident. Stupid mouse cursor.
Star Trek: Bridge Crew is exactly what I was hoping it would be: an immersive simulation that delivers an experience fans have been dreaming about for decades.
Posted: at 11:10 pm
Our sun was still dim. Waves crashed on Martian beaches. Life was emerging on Earth.
That’s when the ghosts of two dead stars, black holes dozens of times more massive than our sun, merged in a far-off corner of the universe. In their final moments, these binary black holes were circling each other hundreds of times per second, as each one spun at 10 times that rate.
The rumbles of distant thunder from that collision reached Earth on Jan. 4 of this year, passing through the detector at the Laser Interferometer Gravitational-Wave Observatory (LIGO) in Hanford, Washington. Then, traveling at the speed of light, this wrinkle in space-time passed through LIGO’s second detector in Livingston, Louisiana, just a fraction of a second later.
The results were published Thursday in the journal Physical Review Letters.
Gravity is the weakest among nature’s four fundamental forces. So only extreme cosmic events like supernovas, neutron stars and merging black holes can make detectable gravitational waves. The waves are so weak that they’d warp the distance between Earth and the sun by just the width of a hydrogen atom. But as these waves pass through LIGO’s twin detectors, their enormous lasers can pick up on the truly tiny stretches and squeezes of space-time. You can think of it like a seismometer for measuring mini quakes in the cosmos’ gravitational fabric.
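The scale of that distortion is easy to check with a back-of-the-envelope calculation. A minimal sketch, assuming a typical LIGO-band strain amplitude of about 10^-21 (a standard order-of-magnitude figure, not one quoted in this article):

```python
# How far does a gravitational wave stretch the Earth-sun distance?
# Strain h is a fractional change in length: displacement = h * distance.
h = 1e-21                 # assumed typical strain amplitude at Earth
earth_sun_m = 1.496e11    # Earth-sun distance, meters
hydrogen_atom_m = 1e-10   # rough diameter of a hydrogen atom, meters

displacement = h * earth_sun_m
print(f"stretch: {displacement:.2e} m")
print(f"in hydrogen-atom widths: {displacement / hydrogen_atom_m:.1f}")
```

That stretch works out to roughly one hydrogen-atom width, which matches the article’s comparison.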
When LIGO gets a hit, the gravitational wave makes a characteristic signal that scientists call a “chirp” because of the sound it makes once translated into a format human ears can hear.
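What makes it a chirp is that frequency and amplitude both climb as the black holes spiral together, then cut off at merger. A toy synthesis of such a waveform (the sample rate, sweep range and envelope here are illustrative choices, not parameters of the real signal):

```python
import numpy as np

# Build a toy "chirp": frequency sweeps upward ever faster, amplitude grows,
# and the signal ends abruptly at the merger.
sample_rate = 4096                            # Hz, LIGO-like audio-band sampling
duration = 0.5                                # seconds
t = np.linspace(0.0, duration, int(sample_rate * duration), endpoint=False)

f0, f1 = 35.0, 300.0                          # start/end of the frequency sweep, Hz
freq = f0 + (f1 - f0) * (t / duration) ** 3   # rises fastest near the end
phase = 2.0 * np.pi * np.cumsum(freq) / sample_rate
envelope = (t / duration) ** 2                # amplitude grows toward merger

chirp = envelope * np.sin(phase)              # write this to a WAV file and it "chirps"
```

Played back as audio, the pitch and volume rise together and then stop — the signature LIGO scientists listen for.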
This was the third such detection since Albert Einstein first predicted gravitational waves a century ago as part of his general theory of relativity, or theory of gravity. Taken together, these observations form the first samples of a black hole census with far-reaching implications.
Before colliding, the binary black holes spotted earlier this year weighed in at 19 and 31 times our sun’s mass. After merging, the pair created a single black hole 49 times more massive than the sun. Einstein’s equations tell us that energy and mass are interchangeable. And so the missing solar mass’ worth of energy was radiated out across the universe as gravitational waves.
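The bookkeeping here is plain E = mc². Taking the article’s masses at face value (19 + 31 going in, 49 coming out, so roughly one solar mass radiated), a quick sketch of the arithmetic:

```python
# Energy radiated as gravitational waves from the "missing" solar mass,
# using E = m * c**2 and the article's before/after masses.
M_SUN_KG = 1.989e30      # solar mass, kg
C = 2.998e8              # speed of light, m/s

radiated_solar_masses = (19 + 31) - 49     # = 1
energy_joules = radiated_solar_masses * M_SUN_KG * C**2
print(f"{energy_joules:.2e} J")            # roughly 1.8e47 joules
```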
And with this detection, scientists for the first time think the two black holes might have been spinning in opposite directions. That could reveal clues about the lives of the stars that formed them. It’s possible that the two stars lived in a dense stellar cluster.
Before LIGO, astronomers didn’t know that so-called solar-mass black holes, which form when stars die, could reach such extreme sizes.
This census can also help explain an enduring mystery in astronomy. Scientists have seen supermassive black holes that dominate entire galaxies, as well as small black holes that form after stars die. We even now know about so-called intermediate-mass black holes weighing as much as thousands of suns. But how do these all form? Do many small black holes combine into larger and larger behemoths? LIGO is just starting to piece together this puzzle.
Posted: at 11:10 pm
When researchers want to take pictures of very small things, like individual molecules, they have to get creative.
When scales shrink to seemingly imperceivable levels, images must be captured using indirect techniques that record how the subject being photographed interacts with its environment. One way to do this is by observing how a beam of particles disperses around the object. Working backward, researchers can then infer what the object in question looks like.
The particle beams that do the heavy lifting for this kind of imaging require sophisticated equipment to create. At the SLAC National Accelerator Laboratory at Stanford University, their linear accelerator stretches out for two miles, focusing beams of charged electrons onto minuscule targets at extremely intense energies. In a paper published Tuesday in Nature, SLAC researchers observed peculiar behavior among atoms subjected to their X-ray beam, and they’re calling it a “molecular black hole.”
The Linac Coherent Light Source (LCLS) at SLAC is used to take pictures of organic molecules and biological processes that take place at scales of only a few atoms. A beam of electrons bounces off the molecules in a predictable way, giving researchers an idea of their structure. This happens in the brief instant before the sample is destroyed by the beam’s intense energy, something the researchers call “diffraction before destruction.” Understanding how the molecules behave as the beam passes through is critical to obtaining precise measurements.
Working with atoms of xenon and molecules containing iodine atoms, the researchers saw something unexpected occur. The beam ripped through the outer shells of the atoms and stripped away the innermost electrons, leaving a gaping void between the nucleus and the outer electrons. The overwhelmingly positive charge this created then sucked in all of the surrounding electrons with enough strength to not only gather its own electrons, but also steal them away from surrounding atoms.
Ordinarily, this kind of electron theft doesn’t happen in nature because the forces involved are too great. Done fast enough, and with enough power, however, the naked nuclei overwhelm the grip of neighboring atoms and siphon off electrons, in a process, the researchers say, that is similar to a black hole consuming a star.
“When we have really, really intense X-rays like we do, there’s enough X-rays that you knock out one electron, and before there’s time for recombination you knock off another, and then knock off another, and so on and so forth,” says LCLS staff scientist and study co-author Sebastien Boutet. “What that ends up doing is stripping most of the inner shells, and then that very highly charged molecule unexpectedly sucked in a bunch of electrons from neighboring atoms as a consequence.”
The molecular version doesn’t work the same way as a cosmic black hole, which relies on immense gravitational forces to suck in matter, but the observed effect is similar. Understanding how the beam interacts with atoms of this size, which often show up in their experiments, will help researchers fine-tune their images. The accelerator is currently undergoing an upgrade which will allow for a drastic increase in the number of beam pulses per second, expanding the machine’s imaging capacity.
The more precision researchers can achieve while working at scales of just a few hundred nanometers, the more they will see.
This article originally appeared on Discover.
Posted: at 11:10 pm
SPECIAL TO THE ORACLE
From the twinkling lights of the stars to the glow of a full moon, students have the opportunity to enjoy the heavens with the astronomy club at Riverfront Park.
All students are welcome as prior knowledge of astronomy is not required.
“It doesn’t matter if you’re a physics major, it doesn’t matter if you have an astronomy minor, it doesn’t matter if you’re not a (science, technology, engineering and mathematics) major,” said Kyle Denny, a junior majoring in physics and president of the astronomy club. “You could be anything and you could come join the astronomy club. It is open to anyone who just wants to connect and learn about the universe and appreciate it.”
“We do a lot of events, too, when planets are in opposition,” said Kami Malestein, a junior majoring in physics and astronomy club vice president.
The astronomy club hosts many activities such as stargazing, full moon watching and eclipse viewing. Students with telescopes or binoculars are encouraged to bring them.
One of Denny’s favorite events was Mercury’s transit in May of last year.
“We watched the planet Mercury go in front of the sun,” Denny said. “We had a great turnout for that one.”
The number of students at an event varies from five to 10 people on stormy nights to a hundred for occurrences such as the Mercury transit.
An even larger turnout is expected for the upcoming solar eclipse on Aug. 21; the viewing location on campus is yet to be determined.
“Although it won’t be a total eclipse visible over USF, there will be a partial phase. Eighty percent of the sun will be blocked out, and it’ll be on the very first day of school,” Denny said. “People are going to stop by and wonder what’s going on with the sun, so they get a chance to look at the sun in a really spectacular event.”
Most of the clubs events take place at Withlacoochee River Park or Riverfront Park, with transportation through students driving themselves or joining a carpooling list.
The astronomy club is one of a few clubs that remain active during the summer. Their next event is scheduled for the next full moon June 9 at Riverfront Park.
“The times that there are not thunderstorms, the Milky Way is nice and prominent in the night sky. You can see it from horizon to horizon,” Denny said. “It’s a really inspiring experience. So, the summer is probably the best time to really look up at the night sky and really appreciate it.”
Posted: at 11:10 pm
The University of the Virgin Islands College of Science and Mathematics, together with the Etelman Observatory, is organizing two upcoming astronomy conferences this summer. The first one, “Generation-GW: Diving into Gravitational Waves,” will take place from June 5-9. The second conference, “Unveiling the Physics Behind Extreme AGN Variability,” will take place from July 11-14. Both conferences will facilitate discussions about crucial breakthroughs in the field of astronomy over the last few years.
“We are establishing a legacy, and these events will improve the recruitment of Virgin Islands students to study physics and astronomy at UVI,” said Dr. Antonino Cucchiara, assistant professor of physics. “The conferences will also demonstrate how research and activities undertaken at UVI can benefit the community.”
The scientific breakthrough to be discussed by international astrophysicists at the June conference is gravitational waves. Widely considered to be the greatest discovery of 21st-century astronomy, this phenomenon describes ripples in the curvature of space-time that propagate at the speed of light, outward from their source.
The other discovery to be discussed, by more than 50 astronomers at the July conference, is fast variable active galactic nuclei (AGN). “The center of every galaxy has a supermassive black hole which is millions of times heavier than our sun. Everything that gets too close to it or falls in is destroyed,” explained Cucchiara. That destruction produces energy that is observable in optical, X-ray and gamma-ray radiation, producing an AGN. The July conference will focus on fast variable AGNs, whose radiation changes quickly in time and which are therefore difficult to observe in detail.
Both conferences will include an undergraduate mentoring component with question and answer sessions, as well as a talk that will be open to the public. The public talk for the June conference is set for 7 p.m. on Thursday, June 8, in the Administration and Conference Center (ACC). It will feature Professor Alberto Sesana from the University of Birmingham in the United Kingdom, and Professor Jillian Bellovary from Queensborough Community College in New York. The public talk in July will also be held on a Thursday; details to be announced.
“UVI and the Etelman Observatory are establishing a path forward to become an astronomy research hub,” said Cucchiara. “It is important for us to involve not just UVI physics faculty, but also international partners, undergraduate researchers and federal agencies.” Eight UVI students will be at the National Aeronautics and Space Administration (NASA) working on a variety of projects, from building the new generation of microsatellite, to studying planets around other stars, to studying the most powerful stellar explosions known in the universe. Some of these projects relate to research that is currently being pursued at UVI, representing the strong connection between both institutions.
Posted: at 11:10 pm
[Artist’s conception of a black hole with material swirling around it in an accretion disk, and also a jet of matter blasting away from it. Until recently, it was thought that a star had to go supernova to create a black hole, but evidence is mounting that it may not. Credit: NASA/JPL-Caltech]
One of the basic truisms in astronomy is that, when a massive star ends its life, it goes out with a bang. A big one. A supernova.
This titanic explosion is triggered when the star runs out of nuclear fuel in its core. The core collapses in a heartbeat, and the energy generated in that collapse is so immense that it blows the outer layers off. This explosion is so colossal it can outshine an entire galaxy! In the meantime, the collapsed core can form an exotic neutron star, or may even squeeze itself down into a black hole.
Now, I’ve skipped some steps there, but that’s the general picture (if you want more, check out my Crash Course Astronomy episode on high-mass stars and supernovae). If you want a black hole, you have to blow up a massive star.
Except, maybe not. It turns out there’s a loophole that could allow a star to bypass the supernova part. It collapses directly down to a black hole without the explosion. Some energy is released, but not much compared to a supernova, and in the end what you get is a now-you-see-it-now-you-don’t situation: The star is there, and then suddenly … it isn’t.
The idea of a failed supernova is an interesting theoretical astrophysical problem, and one scientists have been working on for a while now. But there’s been a new and exciting development: Astronomers now think they’ve seen one!
The star in question is called N6946-BH1, and it was found in a very cool survey specifically designed to look for failed supernovae. Using the Large Binocular Telescope in Arizona, 27 galaxies, all within about 30 million light-years of Earth, were observed over and over again. Each image was painstakingly compared to the others to look for transients: objects that have changed brightness. Even using rather stringent criteria, thousands were found; stars change brightness for a lot of reasons, but most are not due to them going supernova … or, in this case, failing to supernova.
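At its core, that comparison is image differencing: subtract a reference frame from each new frame and flag pixels whose brightness changed far more than the noise allows. A minimal sketch on synthetic data (real survey pipelines also align, PSF-match and photometrically calibrate the frames first):

```python
import numpy as np

rng = np.random.default_rng(0)
NOISE = 5.0

# Fake sky: a smooth background, plus independent noise in each frame.
reference = rng.normal(100.0, NOISE, size=(64, 64))
new_image = reference + rng.normal(0.0, NOISE, size=(64, 64))
new_image[40, 20] += 200.0            # inject a bright transient

# Difference the frames and keep only changes well above the noise.
diff = new_image - reference
threshold = 10.0 * NOISE              # a 10-sigma cut
candidates = np.argwhere(np.abs(diff) > threshold)
print(candidates)                     # the injected pixel at row 40, column 20
```

With a 10-sigma cut, random noise almost never triggers a detection, so only the injected pixel survives; loosening the cut is what produced the survey’s thousands of (mostly mundane) transients.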
Eventually, the number of interesting objects was whittled down to just 15. Six of them turned out to be run-of-the-mill exploding stars (if the titanic explosion of a few octillion tons of star screaming outward at a substantial fraction of the speed of light can be called ho-hum), but nine of them turned out to be more interesting.
Of these, all but one were likely unusual events, like two stars merging, which can cause a very big (and very pretty) eruption, but again falls short of the outcome of a massive star dying. When all was said and done, after searching 27 galaxies for seven years, only one object was left: N6946-BH1.
In earlier images, the star is there, clearly seen in the galaxy NGC 6946, a lovely face-on spiral galaxy roughly 20 million light-years away (and one that has had no fewer than 10 recorded supernovae in the past century; by coincidence one was seen just this year). Then, in later images, it’s gone. Like, gone: disappeared. Poof.
If it had exploded as a supernova it would’ve been seen in the images. Instead, in 2009, it briefly got somewhat brighter, glowing at about a million times brighter than the Sun; then it faded so much it was only about 2% of its previous brightness (that is, pre-collapse) by 2015. And yes, in human terms, a million times the Sun’s luminosity is terrifyingly bright, but in terms of a supernova, it’s barely worth mentioning; a typical one will shine many billions of times brighter than the Sun! So this was, at best, a bit of a pop.
So, how do we know it wasn’t some sort of weird supernova, maybe obscured by lots of dust in the host galaxy? This material is dark and opaque, and can completely block the light from even a normal supernova. Follow-up observations using the Spitzer Space Telescope should reveal that, because infrared light can pierce through the dust. Spitzer did see some IR light from the event, roughly 2,000 to 3,000 times the Sun’s luminosity. Again, that’s a lot, but nowhere near what you’d expect from a supernova. Even a stellar merger would produce more than that.
It really looks like what’s left is what the astronomers had been looking for all along: a failed supernova.
If true, this is very interesting, indeed. Why? Because of physics.
It takes a massive star to explode; it has to have enough pressure in the core (caused by the mass of the star above it squeezing down on it) to fuse successively heavier elements over time. First, hydrogen fuses into helium. Then, when that runs out, helium is fused into carbon, and so on, until the core builds up iron. When iron fuses, it doesn’t release energy; it absorbs it. That’s a big problem, because it’s that release of fusion energy that holds the star up (in a similar fashion that hot air causes a balloon to expand). Once the star tries to fuse iron, the core collapses. If the core has a mass up to about 2.8 times the mass of the Sun, it forms a neutron star, but if it has more, it forms a black hole.
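That mass threshold can be written down directly. A toy classifier using the rough 2.8-solar-mass dividing line (the true boundary is uncertain and depends on the neutron-star equation of state):

```python
# Classify the compact remnant left behind by core collapse, using an
# approximate 2.8-solar-mass dividing line between neutron stars and
# black holes. The exact value is debated.
MAX_NEUTRON_STAR_MASS = 2.8  # solar masses

def remnant_type(core_mass: float) -> str:
    """Return the expected compact remnant for a collapsed core of the given mass."""
    if core_mass <= MAX_NEUTRON_STAR_MASS:
        return "neutron star"
    return "black hole"

print(remnant_type(1.4))   # a typical pulsar-mass core -> neutron star
print(remnant_type(5.0))   # a heavy core -> black hole
```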
And in general, either way, the core collapse triggers the supernova in the outer layers, and kaboom.
But that’s where this gets funny. It may not always happen that way. For a range of core masses, theoretical calculations show that the explosion may stall. The outer layers get a decent kick, but not a huge one. They blow off, but it’s a gentler event than the unfettered violence of a supernova.
That depends on a lot of factors, actually, but it tends to happen when the total star mass is roughly 25 times that of the Sun. Looking at the observations of N6946-BH1, that’s just about the mass it had.
And there’s more. We see lots of high-mass stars in galaxies being born, but there aren’t enough supernovae seen to account for them all. That implies failed supernovae happen relatively often.
Also, when we look at the masses of neutron stars and black holes, we find there’s a gap between them; the lowest-mass black holes are still considerably more massive than the highest-mass neutron stars. If all these compact objects formed from regular supernovae, you’d expect there to be a smooth transition. That’s because, in a supernova, a lot of the material in the star still lingers near the core, and that can fall back on the newly formed neutron star. If there’s enough, the neutron star will then collapse to form a low-mass black hole. So you’d expect to see lots of black holes right at the lower mass limit. But we don’t.
Ah, but in the failed supernova scenario, there’s a lot more material left over; there wasn’t enough energy in the event to blow away all the outer layers. This comes crashing back down and adds its mass to the neutron star, making a far more massive black hole. So, in reality, the existence of failed supernovae explains a lot of different phenomena.
And now, very likely, we’ve seen one! More observations would be nice, though. For example, a newly formed black hole should emit lots of X-rays, as material heats up before falling in. If we see those X-rays, that would go a long way in understanding what we’re seeing.
And again, this is the first one that we’ve seen. Given the number of supernovae that were detected in the survey, it implies that something like 14% of all high-mass star deaths result in failed supernovae. If that’s the case, then we need more eyes on the sky looking for these events. Supernovae are what create and distribute elements literally vital to our existence: iron, calcium and more. Without them, you and I would literally not exist.
In my opinion, that makes these events very much worthy of our study. Even when they fail.
Posted: at 11:09 pm
Indranil Gupta, an associate professor in the Department of Computer Science at the University of Illinois Urbana-Champaign, recalled the first time he offered a free Coursera online class on Cloud Computing Concepts in the spring of 2015. In the first class, Gupta said, Coursera registered a total of 179,000 enrollees from 198 countries.
“That shows you how much interest there is,” he said. “It seems like every single country has some students who are interested.”
Gupta’s assessment matches numerous reports that interest in cloud computing among students had skyrocketed, and courses in computer science departments throughout the nation were becoming commonplace. However, a recent report by Clutch, a Washington, D.C.-based B2B and research firm, found that there were still concerns among universities and professors regarding the cost of teaching cloud computing. Riley Planko, a content developer at Clutch who authored the report, noted that while individual courses and certification programs were increasingly available, undergraduate and master’s programs were still developing.
“For the cost, there was definitely optimism. There’s potential, with regulation and learning how to manage this, that it’s something that can be more under control by the university,” she said. “It’s still a young field. It’s only been around in its true power for a couple of years.”
Higher education institutions have been interested in storing data on cloud servers for several years, and as the Clutch report indicates, cloud computing skills are in high demand by corporations, and increasingly, public institutions (LinkedIn found that knowledge in cloud computing was the most desirable skill in job applicants among employers, according to the report).
Kevin McDonald, the founder and managing director of GreyStaff Group, LLC, also teaches a cloud computing course in the Technology Management master’s program at Georgetown University’s School of Continuing Studies. He said the sea change cloud computing brought to public and private industry was now benefitting individual startups. By eliminating the need for expensive server infrastructure and IT staff, new companies can significantly cut their upfront costs, building their entire infrastructure in the cloud. It is an opportunity McDonald echoes in his course, with teams visualizing and building a phone app within a matter of weeks before presenting it to the class; some had even sought investors for their creations.
“It’s a total revolution under our feet, so as we’ve developed the program, we’ve tried to keep it in the real world,” he said, marveling at the fact that students come up with an idea, go through a startup and are able to present to a venture capitalist within six weeks.
Gupta agreed there was an ongoing transition among higher education institutions on how to offer cloud computing courses integrated into disciplines, instead of in isolation, and he detailed a Master of Computer Science in Data Science currently offered by UIUC. The MCS-DS is an online program with a $19,200 tuition, offering students the ability to proceed at their own pace, and Gupta’s Coursera class in Cloud Computing Systems is integrated into the degree.
Gupta said that while there is always a period of transition where professors in a particular discipline may wonder whether a new facet of the discipline should be integrated or is merely temporary, he was optimistic about how computer science had quickly warmed to introducing cloud computing and big data into curricula.
“Cloud computing as it is today is new, but many of the systems in cloud computing have been around for decades,” he said. “Many of the building blocks have been around for a long time; it’s just that it’s become more available and accessible to students.”
Gupta also said the imposing costs of accessing cloud storage for student use could be alleviated by partnering with companies that offer free or reduced-price resources for students, citing that Amazon Web Services ran a program for several years that would offer $100 worth of credit for proposed research projects.
The company currently offers AWS Educate for institutions, educators and even individual students, touting access to company technology, training resources and open-source content for educational use. Much of UIUC’s work, Gupta said, was done with Microsoft Azure due to a mutual partnership. He said students benefitted from the cloud space, while industries could see benefits once students enter the workforce.
“Companies want students who are more familiar with the state of the technology, so they need as little training as possible when they join,” he said. “They know that all our students are smart; it’s whether they have the necessary skills or need extra training. If Microsoft has students use Microsoft Azure courses, they’re kind of already training them.”
McDonald, who is also the author of Above The Clouds: Managing Risk In The World Of Cloud Computing, said government, after some lag time, was catching up to private industry in the adoption of cloud technology. The Federal Cloud First Initiative, instituted in 2010 by the Obama administration, had led to the closure of more than 3,000 data centers as of April 2016, with a goal of closing 5,203 federal data centers in total by 2019, almost half of the 2010 number.
He said cloud computing, like many burgeoning computer science fields, was increasingly viewed as interdisciplinary, asserting that while the School of Professional Studies valued the technical processes inherent in cloud computing, the increased accessibility of cloud storage for novice users lowered the complexity barrier for interested students.
“It’s gotten to that level of simplicity where we don’t need to worry about that unless we’re turning out system engineers,” he said. “That’s always been the philosophy for this program since day one.”
In addition to cost concerns, Planko’s report found that some professors expressed concern with how to appropriately teach cloud computing in a rapidly changing field, and said the lack of necessary staff at universities could be a hindrance.
Nevertheless, the report concluded that it would be worthwhile for colleges and universities to at least consider the topic for future implementation in their curricula.
Box Inc. is accomplishing its current goal of generating cash from its cloud-software business, and Chief Executive Aaron Levie has plans for more changes down the road, including an artificial-intelligence effort.
After reporting fiscal first-quarter earnings Wednesday afternoon, Levie chatted with MarketWatch for about 10 minutes about the path Box has traveled since its 2015 initial public offering, where the enterprise online-storage company goes from here, and how his sneaker game has changed. The interview has been edited for length and clarity.
MarketWatch: Since the IPO, Box has been able to maintain solid revenue growth, but the last two quarters you have generated positive free cash flow for the first time, which you had targeted. Is that the biggest change for the company financially since going public, and what else has been important so far?
Aaron Levie: I think that's a very key point. I would say that, overall, we've been building a cloud content-management platform for a little over a decade, and what's starting to happen is larger and larger enterprises are adopting Box as their core system of record for securing and managing and governing and organizing their corporate information. We're seeing customers basically do larger transactions with us, buying more seats of the service for their user base, and add on additional products like our Box governance capabilities and some of our advanced security technology.
So basically what's happening is that we're continuing to move more and more upmarket, we're getting more efficient over time with our sales force, and we're growing a larger base of customers, which obviously produces a larger recurring revenue base, which then drives more efficiency from an operational standpoint, and thus generates free cash flow. So I think what's happening is, as you see deal sizes go up and transactions go up and our own internal productivity improve, you're seeing the economics of the business really kind of start to take hold. This is obviously what we had always been building into the business model, but it wasn't always as clear, like when we first went public, that this is what it was going to turn into. I think that's what is starting to happen within the numbers.
MW: A question provided by a person who tweets about Box even more than you, Alex Wilhelm from CrunchBase: How does positive free cash flow impact the business and how do you balance revenue growth with the focus on cash generation?
AL: It hasn't been any kind of significant change as much as just our own evolution as a company. We're now around 1,600 employees, we operate around the world, we have 74,000 customers, so there's a whole bunch of ways in which, as we scale up as an organization (not the least of which is going public), we have just become more operationally rigorous. So as we're scaling up, it makes sense to ensure we have a sustainable business model that doesn't require outside capital, which is why the cash-flow elements of the business are so incredibly important. But it hasn't restricted our growth; we're just making sure that we execute as effectively as possible and that we're driving that growth in as efficient a way as possible. I think that's what you're starting to see show up in the business. I don't think we're trading off that much from a top-line standpoint, but ultimately we're building a much healthier organization and a much healthier business.
MW: What's the next milestone beyond cash generation? Is actual GAAP profitability ahead? You've discussed $1 billion in annual revenues; is there a target year for that? Are there other serious financial goals?
AL: Yeah, we are on a path to $1 billion in revenue over the next few years; that's probably the most significant medium-term milestone, so obviously this year's financial metrics are going to be incredibly important to ensuring we're on that path. We guided to more than $500 million in revenue this year, so the $1 billion mark is the next significant material milestone that we kind of have a flag in the ground on.
MW: When you went public, you talked a lot about how Box was capitalizing on the transition to cloud and mobile, and said that kind of major transformative change in tech happens every 10-15 years. Do you see another of those changes on the way?
AL: Yeah, I think the most significant technology we're seeing is artificial intelligence. We think that the impact of AI within the enterprise is going to be enormous, and we're quite excited about some upcoming announcements we have that will at least point to where Box will be going in the space. I obviously can't reveal too much, but needless to say, we think that AI is going to be substantially powerful for the future of work, and we want to make sure we're embedding intelligent experiences into everything we do and everything we build at Box.
MW: Any big changes in your sneaker game since the IPO? You using your cash to move up to some limited edition Yeezys or anything?
AL: No, it's getting pretty boring on the sneaker front, unfortunately. I'm becoming a little more post-IPO in my sneaker choices. Still sneakers, but less, let's say, colorful.
Box shares have gained 35% in 2017, while the S&P 500 has gained 8%.
"Will the profitability of AWS (Amazon Web Services) decrease over time (to near zero) because the service is basically a commodity?" originally appeared on Quora: the place to gain and share knowledge, empowering people to learn from others and better understand the world.
Answer by Mathew Lodge, San Francisco tech executive, on Quora:
The premise of the question is flawed: Amazon Web Services is nothing like a commodity. I do expect that the profitability of AWS will decline at some point due to competitive intensity, specifically from Microsoft Azure and Google Cloud Platform, but that really isn't the same thing, and it isn't happening yet.
For over nine years now there's been a narrative about AWS that says an IaaS cloud is just a convenient place where you can run some virtual machines on demand. The saying "The cloud is just computers that belong to someone else" embodies this idea. And because one rented virtual machine is much like another, the theory goes, a VM service like AWS is just a commodity, like other fungible on-demand services such as electricity.
Peddlers of this narrative felt emboldened when AWS kept cutting VM prices in the days before we could see any financials about AWS. Surely this constant price erosion was evidence of the commodity nature of AWS?
There are two problems with this narrative:
From the outset of AWS, Amazon was building itself a new platform for building and deploying distributed applications. While it intended to eventually use this platform for Amazon.com, it fully intended to sell it to other people too. AWS was never spare capacity not being used by the retail site, an enduring myth that just won't die.
The death blow to the commodity narrative should have happened when Amazon started breaking out AWS's financial results in April 2015. Amazon revealed a breathtakingly profitable business with financials that look totally unlike those of a commodity service, while also demonstrating a 49% growth rate that most multi-billion-dollar businesses only ever get to dream about.
AWS's EBITDA (Earnings Before Interest, Taxes, Depreciation and Amortization) margin is about 50%. For comparison, the best EBITDA margin Rackspace ever achieved as a hosting/cloud provider was 28%. So AWS is nearly twice as profitable as one of its most efficiently run public-company predecessors in the hosting/cloud business. [I am using EBITDA because it's the best way to compare the profitability of capital-intensive businesses.]
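As a quick sanity check on the "nearly twice" claim, the two margins quoted above can be compared directly (an illustrative calculation, not from the original answer):

```python
# EBITDA margin = EBITDA / revenue; the answer quotes the margins directly.
aws_margin = 0.50        # AWS EBITDA margin cited above
rackspace_margin = 0.28  # Rackspace's best EBITDA margin cited above

ratio = aws_margin / rackspace_margin
print(f"AWS is {ratio:.2f}x as profitable on an EBITDA basis")  # ~1.79x
```

A ratio of roughly 1.8x is what "nearly twice as profitable" refers to.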
Azure and Google Cloud Platform have incredibly competitive basic compute services. Google's compute service is vastly more flexible than AWS's. Yet neither is badly denting AWS's growth rate. Why? Because the code running inside of that VM needs to actually get stuff done, and AWS has a very broad and increasingly deep set of complementary services that software developers can tap into.
Angela Zhang does a great job of explaining how well AWS does this, and how unlike a commodity AWS is, in her answer. Stan Hanks articulates why switching costs are high for the millions already using AWS, and for the millions of new users who don't want to screw things up by choosing the wrong cloud platform.
AWS and Microsoft are battling for control of the next great app platform.
Many people have been surprised that after years of brutally battling all-comers for server operating system revenue share with Windows Server, Microsoft has embraced Linux and done everything it can possibly do to encourage development of cloud apps on Linux on Azure.
Why the sudden change of heart? Satya Nadella realized before many others that the battle for app-developer mindshare was slipping away from the OS to the cloud, and specifically to the API of the cloud the app ran on. When your app's dependencies are all on cloud services provided by an IaaS like AWS, then winning the OS battle doesn't win you much if developers just go run the app on AWS.
Mary Meeker: Healthcare technology is booming thanks to cloud computing and wearables – SiliconANGLE (blog)
Kleiner Perkins Caufield & Byers partner and longtime tech analyst Mary Meeker released her annual Internet Trends Report Wednesday, and more than anything else, she pointed to a transformation of health thanks to big data and cloud computing.
The report, which is highly regarded in the tech community for its insights into trends and predictions, dedicated 31 pages to "Healthcare @ The Digital Inflection Point" and came up with some striking stats about how technology and the Internet are transforming the sector.
At the top of the list, and perhaps the most remarkable number, is the way data is helping develop new medicines. Meeker said the digitization of medical data means that medical knowledge now doubles every 3.5 years, versus doubling only every 50 years in 1950. Meeker added that the increased availability of health data is helping to accelerate clinical trials and encouraging collaboration with the scientific community as well.
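To see what those doubling times imply, note that a doubling time of T years means knowledge grows by a factor of 2^(t/T) over t years. A short illustrative sketch using the figures above:

```python
# Growth factor over `years` for a given doubling time: 2 ** (years / doubling_time)
def growth_factor(years: float, doubling_time: float) -> float:
    return 2 ** (years / doubling_time)

# Over one decade, the report's figures imply:
fast = growth_factor(10, 3.5)  # ~7.2x at a 3.5-year doubling time
slow = growth_factor(10, 50)   # ~1.15x at a 50-year doubling time
print(fast, slow)
```

In other words, at the current pace medical knowledge multiplies roughly sevenfold in a decade, versus barely 15% growth at the 1950 pace.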
That data accumulation, which Meeker describes as "Digitally Native Health-Related Data Sets," comes from many sources, not only from medical establishments themselves but directly from consumers with the proliferation of wearable devices. According to her numbers, global wearable shipments hit 102 million in 2016, a figure five times higher than in 2014, and a remarkable 25 percent of Americans now own a wearable device, with more likely to buy one in the future.
That data requires sharing, and some companies have earned more trust than others in handling it. Google Inc. was trusted by 60 percent of those polled to handle health data, while Microsoft Corp. and Samsung Electronics Co. Ltd. were not far behind at 56 percent and 54 percent, respectively. At the other end, consumers didn't trust Amazon.com Inc. and IBM Corp. nearly as much, with those companies trusted by only 39 and 37 percent of people, respectively.
The surge in wearables has also been matched by a surge in health-related apps, with downloads hitting more than 1.2 billion in 2016. The types of apps were split across the health spectrum, with the most popular, fitness, sitting at 36 percent followed by disease and treatment at 24 percent and lifestyle and stress at 17 percent.
All the advances in healthcare technology wouldn't have been possible without growing cloud-computing support. The cloud got its own section, with the report noting that "Cloud Adoption = Reaching New Heights + Creating New Opportunities."
Although traditional data center spending still accounted for the majority of global information technology infrastructure spending in 2016, the type of spending is changing. Private and public cloud infrastructure accounted for 37 percent of total spending last year, versus 23 percent in 2013. Going forward, Meeker cites a survey indicating that many enterprises are considering cloud adoption, with 57 percent of respondents saying they planned to run apps on Amazon Web Services alone, and growing support for Microsoft's Azure at 37 percent.