The Evolutionary Perspective
Tag Archives: utility
Posted: February 20, 2017 at 6:51 pm
Cryptocurrency hardware wallet KeepKey announced in a press release sent to CoinReport that it has integrated the cryptocurrency dash.
Those using KeepKey can now receive, store and send their dash on their KeepKey devices, and via the built-in ShapeShift function, swap between other digital currencies like bitcoin, dogecoin, namecoin and litecoin.
Dash currently has a market capitalization of over $120 million and stands seventh on the most valuable digital asset list. In September and October last year, the median 24-hour trading volume of the cryptocurrency was about $680,000. In November and December, the amount almost doubled. Now, in the first quarter of 2017, it's averaging about $2 million per day.
KeepKey founder & CEO Darin Stanchfield
"Partnering with Dash is the natural next step for KeepKey since our wallet is purely focused on security, mobility and convenience, attributes that Dash shares," said KeepKey CEO Darin Stanchfield in the release.
"KeepKey protects digital assets from hackers by limiting their exposure to the internet. With this integration, we are extending our utility, and adding one more asset users can transfer to or from directly on our device."
Dash director of finance Ryan Taylor said on the occasion, "As the ecosystem of services integrated with Dash continues to grow and mature, solutions to ensure user funds are stored as securely as possible are more important than ever. Because of Dash's unique capabilities and the announcement and execution of our software upgrade, we've experienced triple-digit price appreciation, so many of our users will love having this option available to secure the growing value of their Dash.
"We partnered with KeepKey because we aim to create the best overall experience for our users. The peace of mind and convenience that KeepKey can provide makes them a valuable addition to our family of partners. Their products also align well with our belief that the user experience should be a focus."
This beta release by KeepKey, however, will not support dash's InstantSend and PrivateSend functions.
"We like to listen to the community and implement new technology in response to community requests and feedback. While hacks in today's global information age are pervasive, and so too is the rise of cryptocurrency, more and more people are investing in a digital vault for their valuable cryptocurrencies. Crypto assets vary in their characteristics, and attract different audiences for different reasons. One thing we are certain of, though, is that the trend of investing in and holding multiple types of assets is becoming increasingly popular, and KeepKey helps make this both simpler and safer," Stanchfield added.
The entire digital currency market has almost tripled in value in a year, from $7 billion in January 2016 to more than $19.9 billion now.
Images courtesy of KeepKey via PR firm Wachsman PR
Posted: February 12, 2017 at 7:08 am
Mary Hansen, Staff Writer, @maryfhansen
In 2015, City Water, Light and Power came to the Springfield City Council with a problem. The public utility was at risk of a technical default on its bonds and needed a quick infusion of cash.
The council eventually approved what amounted to a $4.4 million bailout for CWLP. City officials then went to work on refinancing the electric fund's debt, renegotiating its coal contract and restructuring electric rates, which they see as largely successful efforts to turn around CWLP's finances.
With the city facing dropping revenues and a tough budget year, some aldermen say now it's time for the utility to pay back at least some of that $4.4 million.
But utility and city officials warn that a transfer could set back the progress they've made. City lawyers are looking into whether it's legally possible.
"The situation is in fact the reverse of what it was two years ago," said Ward 7 Ald. Joe McMenamin, who is pushing for the transfer of $1.3 million from CWLP to the city's main account, which is called the corporate fund and pays for most non-utility city services.
"The electric division was suffering financially and the corporate fund was growing," he said. "Now the electric division is very healthy and the corporate fund is suffering because of a downturn in the local economy."
Worries about rating agencies
Mayor Jim Langfelder has often touted CWLP's turnaround, including its stable bond rating, as a key accomplishment for his administration and the council that took office in 2015. He has said a payback could send the wrong message to credit rating agencies.
Chief engineer Doug Brown echoed these concerns, saying the agencies would have a negative outlook on the transfer and it could lead to a credit downgrade. Credit ratings determine how much interest the utility pays on its bonds.
"It will take a very long time to recover from this action and counter another negative outlook," Brown wrote in an emailed statement. "Any ratings downgrade is an increase in costs to our customers, the citizens of Springfield."
CWLP's contracted financial analyst told the utility any transfer would trigger a credit review by an agency, according to spokeswoman Amber Sabin.
A spokesman for Moody's Investors Service declined to comment so early in the discussion of the proposal but said Moody's would be monitoring the situation.
In the fall of 2015, Moody's improved its outlook on the CWLP electric fund's finances, changing it from negative to stable, just after the council voted to change the way the utility charged customers.
In its report, the agency said that if officials stopped supporting improvements to CWLP's financial position, it could lead to a downgrade of the utility's credit rating.
The $1.3 million transfer could be seen as weakening support, Sabin said.
But McMenamin argued that the transfer amount is relatively small compared to the utility's more than $300 million budget.
"I think the bond rating agencies are looking at broader trends than a $1.3 million transfer," he said. "They'll be looking at continued strength of the debt coverage ratio and the continuing reserves of the electric division, which is what's happening."
He pointed to a recent quarterly update from the utility that put its monthly reserves at $18 million in November.
Still, according to the utility, the standard set by rating agencies for utilities of CWLP's size is to have $33.9 million cash on hand.
McMenamin has introduced two measures to repeal the 2015 bailout ordinances, which the council could discuss Tuesday at its committee meeting.
At the time, the ordinances waived the utility's payment in lieu of taxes, which is money CWLP pays into the corporate fund, and instituted a refund of previous payments if necessary to ensure that CWLP had enough money to meet its debt coverage ratio for the fiscal year.
Payments were waived or refunded enough for the utility to meet its obligations that year. But, Sabin warns, if the money were paid back, auditors could revise that fiscal year's books, triggering a review and potential downgrade from rating agencies.
Plus, the first ordinance stated that the amount would not have to be repaid, Brown pointed out.
But the council has the power to change what the previous council passed and should do so because the financial situation has changed, McMenamin countered.
"That should be an option on the table for the mayor," McMenamin said. "There should be a full repayment if the electric division continues to grow more financially healthy and if there is a need."
— Contact Mary Hansen: 788-1528, email@example.com, twitter.com/maryfhansen.
Posted: February 9, 2017 at 6:17 am
Citing cost and the availability of cheap space at the landfill in which to bury toxic materials, Casper City Council voted on Tuesday to effectively end its legally mandated electronic waste recycling program.
Council rejected a five-year contract with Electronic Recyclers International, based in Aurora, Colorado, despite a city ordinance passed in 2009 that bars Casper from dumping electronic waste in its landfill.
"It probably started as a feel-good measure," said councilman Chris Walsh. "If we stop, it can go in our lined landfill."
"Electronics can contain lead, chromium, cadmium, mercury, beryllium, nickel, zinc and brominated flame retardants," the website states. "When electronics are not disposed of or recycled properly, these toxic materials can present problems."
Walsh and other council members cited the annual $57,400 cost of the five-year contract, despite solid waste division manager Cynthia Langston's clarification that the city would pay that amount only under the worst-case scenario.
"It looks to me like we're spending $57,000 on a measure that's more politically correct than it is necessary for us," Walsh said. "Over the term of this contract, we're going to save a quarter million dollars."
Langston had clarified at council's pre-meeting that the actual payment would likely be around $25,000 per year.
She said that dumping electronics in the city's landfill instead would cost $10,000 to $15,000 per year.
In an interview Wednesday, Langston said that she had miscalculated and the cost would be closer to $4,000 per year to dispose of electronics in the landfill, meaning the city would save about $20,000 per year by rejecting the contract.
Council members did not have the information when they voted against the agreement.
At the pre-meeting, Walsh said he was prepared to take what he saw as the politically unpopular position of opposing recycling.
"Nobody wants to say that," he said. "I say smash it with a bulldozer."
Langston also said that the recycling program's cost was already covered by the approximately $28,000 in annual fees paid by residents earmarked for recycling electronics.
Ending the program will also affect other cities, like Rawlins, which pay Casper to dispose of their residents trash and recyclables.
Councilman Charlie Powell said that while he supported recycling in theory, it was better for Casper to use its landfill rather than truck the electronics to Colorado.
"We have enough land to run the landfill for another 1,000 years," Powell said. "We can bury a lot of trash in Casper."
Walsh also said that because not all of the electronics that would be shipped to Colorado could be recycled, he would prefer they go into the local landfill.
Langston told council that some parts of certain electronics, like wood panels on old stereo systems, had to be thrown away. But she said 96 percent of the waste would be recycled.
Langston said city residents had demanded an electronics recycling program in the early 2000s after the issue of children picking toxic materials out of old American computers and cellphones in developing countries gained national attention.
"You saw the little kids and they were melting the electronics and it was really bad for the environment," she said.
Still, Langston said cutting the program would be an easy way to save money during a budget crunch.
"At a time when you want to cut budgets, recycling is what you should cut first," she said.
Walsh speculated that since Casper residents paid a 12-cent monthly fee for the recycling program as part of their utility bill, the city might be able to pass a rate decrease if it began dumping the electronics in its landfill.
Langston said that since the council banned dumping electronic waste in the landfill in 2009, a local organization that helps people with disabilities had recycled the electronics at a discount as a way to provide jobs for that population.
But Northwest Community Action Programs of Wyoming lost several hundred thousand dollars in federal funding this year and was forced to end its recycling program a few weeks ago.
Mayor Kenyne Humphrey asked whether council's rejection of the contract would disrupt operations at the solid waste facility given the city's existing ban on putting electronics in the landfill.
City attorney Bill Luben pointed out that council would need to vote three times to repeal the ordinance. He said council could temporarily approve the contract since it could be cancelled at no charge with 30 days' notice.
"If you don't move forward with this, I'm not sure what the timing is for items to build up," Luben said.
Langston said if her facility filled up, she would ask city manager V.H. McDonald to landfill some of it, despite the ban on doing so.
She acknowledged in an interview that McDonald would be violating city law by allowing her to do that and said he could also instruct her to store the waste in public storage space around Casper.
Langston said the facility would likely reach capacity in the next three to four weeks.
The soonest the ordinance could be repealed would be March 21.
Langston said that according to the Wyoming Department of Environmental Quality, the 80 tons of electronics waste the city receives each year could be placed in Casper's lined landfill, which has a physical barrier between the pit and the ground so that toxic materials do not drain directly into the North Platte River watershed.
Powell said the 80 annual tons was a tiny fraction of the 400 tons of waste the landfill collects per day.
The $20,000 the city is likely to save by cancelling the electronics waste program makes up about 0.2 percent of the sanitation division's roughly $11 million annual budget.
Council members rejected a motion by councilman Jesse Morgan to postpone a vote on the contract until city staff could explore other, less expensive options for safely disposing of electronics.
"I don't think we'll gain much information that would change anybody's mind," said councilman Bob Hopkins. "This is just not a winner."
Langston clarified on Wednesday that she was personally in favor of the recycling program, which she noted was initially advocated for by local residents.
"If they really want it, they need to tell their council people," Langston said. "We absolutely cover the cost [of the program] through that 12 cents per month charge to citizens. We can do it; that's not the issue."
Posted: December 26, 2016 at 3:02 pm
Build automation is the process of automating the creation of a software build and the associated processes including: compiling computer source code into binary code, packaging binary code, and running automated tests.
Historically, build automation was accomplished through makefiles. Today, there are two general categories of tools: build-automation utilities and build servers, both described below. Builds can also be classified by their level of automation; a software list for each category can be found in the list of build automation software.
Build automation utilities allow the automation of simple, repeatable tasks. Given a goal, the tool calculates how to reach it by executing tasks in the correct, specific order. Build tools differ in being task-oriented versus product-oriented: task-oriented tools describe the dependency network in terms of a specific set of tasks, while product-oriented tools describe things in terms of the products they generate.
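The task-ordering idea described above can be sketched in a few lines. The task names and dependency graph below are illustrative only, not drawn from any particular build tool:

```python
# Sketch of how a task-oriented build tool schedules work: given a
# dependency graph, produce an order in which every task runs only
# after all of its prerequisites have run. Task names are hypothetical.
from graphlib import TopologicalSorter

# Map each task to the set of tasks it depends on.
tasks = {
    "compile": set(),
    "test": {"compile"},
    "package": {"compile"},
    "publish": {"test", "package"},
}

order = list(TopologicalSorter(tasks).static_order())
# "compile" appears before "test" and "package"; "publish" comes last.
print(order)
```

A real build tool adds change detection on top of this ordering, skipping tasks whose inputs have not changed since the last run.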
Although build servers existed long before continuous integration servers, the terms are generally synonymous; however, a build server may also be incorporated into an ARA (application release automation) or ALM (application lifecycle management) tool.
Automation is achieved through the use of a compile farm, either for distributed compilation or for the execution of the utility step. The distributed build process must have machine intelligence to understand the source-code dependencies in order to execute the distributed build.
Build automation is considered the first step in moving toward implementing a culture of Continuous Delivery and DevOps. Build automation combined with Continuous Integration, deployment, application release automation, and many other processes help move an organization forward in establishing software delivery best practices.
The advantages of build automation to software development projects include faster and more repeatable builds, the elimination of redundant manual steps, and fewer errors introduced by hand-run processes.
Posted: December 7, 2016 at 8:06 am
There is Plenty of Room at the Bottom: An Invitation to Enter a New Field of Physics
by Richard P. Feynman
This transcript of the classic talk that Richard Feynman gave on December 29th 1959 at the annual meeting of the American Physical Society at the California Institute of Technology (Caltech) was first published in Caltech Engineering and Science, Volume 23:5, February 1960, pp 22-36. It has been made available on the web at http://www.zyvex.com/nanotech/feynman.html with their kind permission. The scanned original is available.
For an account of the talk and how people reacted to it, see chapter 4 of Nano! by Ed Regis, Little/Brown 1995. An excellent technical introduction to nanotechnology is Nanosystems: molecular machinery, manufacturing, and computation by K. Eric Drexler, Wiley 1992.
The Feynman Lectures on Physics are available online.
I would like to describe a field, in which little has been done, but in which an enormous amount can be done in principle. This field is not quite the same as the others in that it will not tell us much of fundamental physics (in the sense of, “What are the strange particles?”) but it is more like solid-state physics in the sense that it might tell us much of great interest about the strange phenomena that occur in complex situations. Furthermore, a point that is most important is that it would have an enormous number of technical applications.
What I want to talk about is the problem of manipulating and controlling things on a small scale.
As soon as I mention this, people tell me about miniaturization, and how far it has progressed today. They tell me about electric motors that are the size of the nail on your small finger. And there is a device on the market, they tell me, by which you can write the Lord’s Prayer on the head of a pin. But that’s nothing; that’s the most primitive, halting step in the direction I intend to discuss. It is a staggeringly small world that is below. In the year 2000, when they look back at this age, they will wonder why it was not until the year 1960 that anybody began seriously to move in this direction.
Why cannot we write the entire 24 volumes of the Encyclopaedia Brittanica on the head of a pin?
Let’s see what would be involved. The head of a pin is a sixteenth of an inch across. If you magnify it by 25,000 diameters, the area of the head of the pin is then equal to the area of all the pages of the Encyclopaedia Brittanica. Therefore, all it is necessary to do is to reduce in size all the writing in the Encyclopaedia by 25,000 times. Is that possible? The resolving power of the eye is about 1/120 of an inch, which is roughly the diameter of one of the little dots on the fine half-tone reproductions in the Encyclopaedia. This, when you demagnify it by 25,000 times, is still 80 angstroms in diameter, or 32 atoms across, in an ordinary metal. In other words, one of those dots still would contain in its area 1,000 atoms. So, each dot can easily be adjusted in size as required by the photoengraving, and there is no question that there is enough room on the head of a pin to put all of the Encyclopaedia Brittanica.
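Feynman's figures here hold up numerically. The sketch below redoes the unit conversions; the ~2.5 angstrom atomic spacing is an assumed typical value for a metal, not a number from the talk:

```python
# Re-derive the dot size: the eye resolves ~1/120 inch, demagnified 25,000x.
INCH_TO_M = 0.0254
dot_m = (1 / 120) * INCH_TO_M / 25_000   # dot diameter after demagnification
dot_angstrom = dot_m / 1e-10             # ~85 angstroms ("roughly 80")
atoms_across = dot_angstrom / 2.5        # ~34 atoms at 2.5 A spacing
print(dot_angstrom, atoms_across)
```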
Furthermore, it can be read if it is so written. Let’s imagine that it is written in raised letters of metal; that is, where the black is in the Encyclopedia, we have raised letters of metal that are actually 1/25,000 of their ordinary size. How would we read it?
If we had something written in such a way, we could read it using techniques in common use today. (They will undoubtedly find a better way when we do actually have it written, but to make my point conservatively I shall just take techniques we know today.) We would press the metal into a plastic material and make a mold of it, then peel the plastic off very carefully, evaporate silica into the plastic to get a very thin film, then shadow it by evaporating gold at an angle against the silica so that all the little letters will appear clearly, dissolve the plastic away from the silica film, and then look through it with an electron microscope!
There is no question that if the thing were reduced by 25,000 times in the form of raised letters on the pin, it would be easy for us to read it today. Furthermore, there is no question that we would find it easy to make copies of the master; we would just need to press the same metal plate again into plastic and we would have another copy.
This method might be very slow because of space charge limitations. There will be more rapid methods. We could first make, perhaps by some photo process, a screen which has holes in it in the form of the letters. Then we would strike an arc behind the holes and draw metallic ions through the holes; then we could again use our system of lenses and make a small image in the form of ions, which would deposit the metal on the pin.
A simpler way might be this (though I am not sure it would work): We take light and, through an optical microscope running backwards, we focus it onto a very small photoelectric screen. Then electrons come away from the screen where the light is shining. These electrons are focused down in size by the electron microscope lenses to impinge directly upon the surface of the metal. Will such a beam etch away the metal if it is run long enough? I don’t know. If it doesn’t work for a metal surface, it must be possible to find some surface with which to coat the original pin so that, where the electrons bombard, a change is made which we could recognize later.
There is no intensity problem in these devices, not what you are used to in magnification, where you have to take a few electrons and spread them over a bigger and bigger screen; it is just the opposite. The light which we get from a page is concentrated onto a very small area so it is very intense. The few electrons which come from the photoelectric screen are demagnified down to a very tiny area so that, again, they are very intense. I don’t know why this hasn’t been done yet!
That’s the Encyclopaedia Brittanica on the head of a pin, but let’s consider all the books in the world. The Library of Congress has approximately 9 million volumes; the British Museum Library has 5 million volumes; there are also 5 million volumes in the National Library in France. Undoubtedly there are duplications, so let us say that there are some 24 million volumes of interest in the world.
What would happen if I print all this down at the scale we have been discussing? How much space would it take? It would take, of course, the area of about a million pinheads because, instead of there being just the 24 volumes of the Encyclopaedia, there are 24 million volumes. The million pinheads can be put in a square of a thousand pins on a side, or an area of about 3 square yards. That is to say, the silica replica with the paper-thin backing of plastic, with which we have made the copies, with all this information, is on an area of approximately the size of 35 pages of the Encyclopaedia. That is about half as many pages as there are in this magazine. All of the information which all of mankind has ever recorded in books can be carried around in a pamphlet in your hand and not written in code, but as a simple reproduction of the original pictures, engravings, and everything else on a small scale without loss of resolution.
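The area estimate above is easy to verify; the pinhead size and pin count are taken from the text:

```python
# A million pinheads, 1,000 on a side, each 1/16 inch across.
pinhead_in = 1 / 16
side_yd = 1000 * pinhead_in / 36      # side of the square, converted to yards
area_sq_yd = side_yd ** 2
print(area_sq_yd)                     # ~3.0 square yards, as stated
```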
What would our librarian at Caltech say, as she runs all over from one building to another, if I tell her that, ten years from now, all of the information that she is struggling to keep track of (120,000 volumes, stacked from the floor to the ceiling, drawers full of cards, storage rooms full of the older books) can be kept on just one library card! When the University of Brazil, for example, finds that their library is burned, we can send them a copy of every book in our library by striking off a copy from the master plate in a few hours and mailing it in an envelope no bigger or heavier than any other ordinary air mail letter.
Now, the name of this talk is "There is Plenty of Room at the Bottom," not just "There is Room at the Bottom." What I have demonstrated is that there is room, that you can decrease the size of things in a practical way. I now want to show that there is plenty of room. I will not now discuss how we are going to do it, but only what is possible in principle; in other words, what is possible according to the laws of physics. I am not inventing anti-gravity, which is possible someday only if the laws are not what we think. I am telling you what could be done if the laws are what we think; we are not doing it simply because we haven’t yet gotten around to it.
Let us represent a dot by a small spot of one metal, the next dash by an adjacent spot of another metal, and so on. Suppose, to be conservative, that a bit of information is going to require a little cube of atoms 5 x 5 x 5, that is, 125 atoms. Perhaps we need a hundred and some odd atoms to make sure that the information is not lost through diffusion, or through some other process.
I have estimated how many letters there are in the Encyclopaedia, and I have assumed that each of my 24 million books is as big as an Encyclopaedia volume, and have calculated, then, how many bits of information there are (10^15). For each bit I allow 100 atoms. And it turns out that all of the information that man has carefully accumulated in all the books in the world can be written in this form in a cube of material one two-hundredth of an inch wide, which is the barest piece of dust that can be made out by the human eye. So there is plenty of room at the bottom! Don’t tell me about microfilm!
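The cube-of-dust figure also survives a quick check. Again, the ~2.5 angstrom atomic spacing is an assumed value, not one Feynman states:

```python
# 10^15 bits at 100 atoms per bit, packed into a cube.
total_atoms = 1e15 * 100
spacing_m = 2.5e-10                          # assumed atomic spacing
side_m = total_atoms ** (1 / 3) * spacing_m  # cube edge length in meters
side_in = side_m / 0.0254
print(side_in)                               # ~0.005 inch: roughly 1/200 inch
```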
This fact, that enormous amounts of information can be carried in an exceedingly small space, is, of course, well known to the biologists, and resolves the mystery which existed before we understood all this clearly, of how it could be that, in the tiniest cell, all of the information for the organization of a complex creature such as ourselves can be stored. All this information (whether we have brown eyes, or whether we think at all, or that in the embryo the jawbone should first develop with a little hole in the side so that later a nerve can grow through it) is contained in a very tiny fraction of the cell, in the form of long-chain DNA molecules in which approximately 50 atoms are used for one bit of information about the cell.
We have friends in other fields, in biology, for instance. We physicists often look at them and say, "You know the reason you fellows are making so little progress?" (Actually I don’t know any field where they are making more rapid progress than they are in biology today.) "You should use more mathematics, like we do." They could answer us, but they’re polite, so I’ll answer for them: "What you should do in order for us to make more rapid progress is to make the electron microscope 100 times better."
What are the most central and fundamental problems of biology today? They are questions like: What is the sequence of bases in the DNA? What happens when you have a mutation? How is the base order in the DNA connected to the order of amino acids in the protein? What is the structure of the RNA; is it single-chain or double-chain, and how is it related in its order of bases to the DNA? What is the organization of the microsomes? How are proteins synthesized? Where does the RNA go? How does it sit? Where do the proteins sit? Where do the amino acids go in? In photosynthesis, where is the chlorophyll; how is it arranged; where are the carotenoids involved in this thing? What is the system of the conversion of light into chemical energy?
It is very easy to answer many of these fundamental biological questions; you just look at the thing! You will see the order of bases in the chain; you will see the structure of the microsome. Unfortunately, the present microscope sees at a scale which is just a bit too crude. Make the microscope one hundred times more powerful, and many problems of biology would be made very much easier. I exaggerate, of course, but the biologists would surely be very thankful to you and they would prefer that to the criticism that they should use more mathematics.
The theory of chemical processes today is based on theoretical physics. In this sense, physics supplies the foundation of chemistry. But chemistry also has analysis. If you have a strange substance and you want to know what it is, you go through a long and complicated process of chemical analysis. You can analyze almost anything today, so I am a little late with my idea. But if the physicists wanted to, they could also dig under the chemists in the problem of chemical analysis. It would be very easy to make an analysis of any complicated chemical substance; all one would have to do would be to look at it and see where the atoms are. The only trouble is that the electron microscope is one hundred times too poor. (Later, I would like to ask the question: Can the physicists do something about the third problem of chemistry, namely, synthesis? Is there a physical way to synthesize any chemical substance?)
The reason the electron microscope is so poor is that the f-value of the lenses is only 1 part to 1,000; you don’t have a big enough numerical aperture. And I know that there are theorems which prove that it is impossible, with axially symmetrical stationary field lenses, to produce an f-value any bigger than so and so; and therefore the resolving power at the present time is at its theoretical maximum. But in every theorem there are assumptions. Why must the field be axially symmetrical? Why must the field be stationary? Can’t we have pulsed electron beams in fields moving up along with the electrons? Must the field be symmetrical? I put this out as a challenge: Is there no way to make the electron microscope more powerful?
There may even be an economic point to this business of making things very small. Let me remind you of some of the problems of computing machines. In computers we have to store an enormous amount of information. The kind of writing that I was mentioning before, in which I had everything down as a distribution of metal, is permanent. Much more interesting to a computer is a way of writing, erasing, and writing something else. (This is usually because we don’t want to waste the material on which we have just written. Yet if we could write it in a very small space, it wouldn’t make any difference; it could just be thrown away after it was read. It doesn’t cost very much for the material).
If I look at your face I immediately recognize that I have seen it before. (Actually, my friends will say I have chosen an unfortunate example here for the subject of this illustration. At least I recognize that it is a man and not an apple.) Yet there is no machine which, with that speed, can take a picture of a face and say even that it is a man; and much less that it is the same man that you showed it before unless it is exactly the same picture. If the face is changed; if I am closer to the face; if I am further from the face; if the light changes I recognize it anyway. Now, this little computer I carry in my head is easily able to do that. The computers that we build are not able to do that. The number of elements in this bone box of mine are enormously greater than the number of elements in our “wonderful” computers. But our mechanical computers are too big; the elements in this box are microscopic. I want to make some that are sub-microscopic.
If we wanted to make a computer that had all these marvelous extra qualitative abilities, we would have to make it, perhaps, the size of the Pentagon. This has several disadvantages. First, it requires too much material; there may not be enough germanium in the world for all the transistors which would have to be put into this enormous thing. There is also the problem of heat generation and power consumption; TVA would be needed to run the computer. But an even more practical difficulty is that the computer would be limited to a certain speed. Because of its large size, there is finite time required to get the information from one place to another. The information cannot go any faster than the speed of light so, ultimately, when our computers get faster and faster and more and more elaborate, we will have to make them smaller and smaller.
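The speed-of-light argument above can be sketched numerically. The sizes used here (a roughly 430 m Pentagon-scale machine, a 1 cm chip) are illustrative assumptions, not figures from the lecture:

```python
# Sketch: the time for a signal to cross a computer bounds its clock rate.
C = 3.0e8  # speed of light, m/s

def max_clock_hz(size_m: float) -> float:
    """Upper bound on clock rate if one signal crossing must fit in a cycle."""
    return C / size_m

pentagon = max_clock_hz(430.0)  # Pentagon-sized machine (~430 m across, assumed)
chip = max_clock_hz(0.01)       # a 1 cm chip

print(f"Pentagon-sized: {pentagon:.2e} Hz")  # ~7e5 Hz
print(f"1 cm chip:      {chip:.2e} Hz")      # ~3e10 Hz
```

The five-orders-of-magnitude gap is the point: a faster computer must be a smaller computer.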
But there is plenty of room to make them smaller. There is nothing that I can see in the physical laws that says the computer elements cannot be made enormously smaller than they are now. In fact, there may be certain advantages.
But I would like to discuss, just for amusement, that there are other possibilities. Why can’t we manufacture these small computers somewhat like we manufacture the big ones? Why can’t we drill holes, cut things, solder things, stamp things out, mold different shapes all at an infinitesimal level? What are the limitations as to how small a thing has to be before you can no longer mold it? How many times when you are working on something frustratingly tiny like your wife’s wrist watch, have you said to yourself, “If I could only train an ant to do this!” What I would like to suggest is the possibility of training an ant to train a mite to do this. What are the possibilities of small but movable machines? They may or may not be useful, but they surely would be fun to make.
Consider any machine (for example, an automobile) and ask about the problems of making an infinitesimal machine like it. Suppose, in the particular design of the automobile, we need a certain precision of the parts; we need an accuracy, let’s suppose, of 4/10,000 of an inch. If things are more inaccurate than that in the shape of the cylinder and so on, it isn’t going to work very well. If I make the thing too small, I have to worry about the size of the atoms; I can’t make a circle out of “balls” so to speak, if the circle is too small. So, if I make the error, corresponding to 4/10,000 of an inch, correspond to an error of 10 atoms, it turns out that I can reduce the dimensions of an automobile 4,000 times, approximately, so that it is 1 mm across. Obviously, if you redesign the car so that it would work with a much larger tolerance, which is not at all impossible, then you could make a much smaller device.
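The arithmetic behind the 4,000-times figure can be checked directly; the atomic diameter used below is an assumed round number, not a value from the lecture:

```python
# Check: a tolerance of 4/10,000 inch mapped onto a 10-atom error
# gives the allowed linear reduction factor.
INCH_M = 0.0254           # metres per inch
ATOM_M = 2.5e-10          # rough atomic diameter (assumed round number)

tolerance_m = 4e-4 * INCH_M     # ~1.0e-5 m
atomic_error_m = 10 * ATOM_M    # ~2.5e-9 m
reduction = tolerance_m / atomic_error_m
print(f"reduction factor ≈ {reduction:.0f}")  # roughly 4,000
```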
It is interesting to consider what the problems are in such small machines. Firstly, with parts stressed to the same degree, the forces go as the area you are reducing, so that things like weight and inertia are of relatively no importance. The strength of material, in other words, is very much greater in proportion. The stresses and expansion of the flywheel from centrifugal force, for example, would be the same proportion only if the rotational speed is increased in the same proportion as we decrease the size. On the other hand, the metals that we use have a grain structure, and this would be very annoying at small scale because the material is not homogeneous. Plastics and glass and things of this amorphous nature are very much more homogeneous, and so we would have to make our machines out of such materials.
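The scaling claim in this paragraph can be stated as a small sketch: under a uniform linear reduction, cross-sectional areas (and hence strengths and stressed forces) shrink more slowly than volumes (and hence weights and inertias):

```python
# Under uniform reduction by linear factor s:
#   strength ~ area   ~ s**2
#   weight   ~ volume ~ s**3
# so strength relative to weight improves as 1/s.
def strength_to_weight(s: float) -> float:
    """Strength-to-weight ratio relative to full scale (s = 1)."""
    return s ** 2 / s ** 3  # = 1/s

print(strength_to_weight(1 / 4000))  # ≈ 4000: vastly stronger in proportion
```

The factor of 4000 here is just the automobile example from the preceding paragraph.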
There are problems associated with the electrical part of the system with the copper wires and the magnetic parts. The magnetic properties on a very small scale are not the same as on a large scale; there is the “domain” problem involved. A big magnet made of millions of domains can only be made on a small scale with one domain. The electrical equipment won’t simply be scaled down; it has to be redesigned. But I can see no reason why it can’t be redesigned to work again.
At such a small scale, heat escapes from the engine very quickly; this rapid heat loss would prevent the gasoline from exploding, so an internal combustion engine is impossible. Other chemical reactions, liberating energy when cold, can be used. Probably an external supply of electrical power would be most convenient for such small machines.
What would be the utility of such machines? Who knows? Of course, a small automobile would only be useful for the mites to drive around in, and I suppose our Christian interests don’t go that far. However, we did note the possibility of the manufacture of small elements for computers in completely automatic factories, containing lathes and other machine tools at the very small level. The small lathe would not have to be exactly like our big lathe. I leave to your imagination the improvement of the design to take full advantage of the properties of things on a small scale, and in such a way that the fully automatic aspect would be easiest to manage.
A friend of mine (Albert R. Hibbs) suggests a very interesting possibility for relatively small machines. He says that, although it is a very wild idea, it would be interesting in surgery if you could swallow the surgeon. You put the mechanical surgeon inside the blood vessel and it goes into the heart and “looks” around. (Of course the information has to be fed out.) It finds out which valve is the faulty one and takes a little knife and slices it out. Other small machines might be permanently incorporated in the body to assist some inadequately-functioning organ.
Now comes the interesting question: How do we make such a tiny mechanism? I leave that to you. However, let me suggest one weird possibility. You know, in the atomic energy plants they have materials and machines that they can’t handle directly because they have become radioactive. To unscrew nuts and put on bolts and so on, they have a set of master and slave hands, so that by operating a set of levers here, you control the “hands” there, and can turn them this way and that so you can handle things quite nicely.
Most of these devices are actually made rather simply, in that there is a particular cable, like a marionette string, that goes directly from the controls to the “hands.” But, of course, things also have been made using servo motors, so that the connection between the one thing and the other is electrical rather than mechanical. When you turn the levers, they turn a servo motor, and it changes the electrical currents in the wires, which repositions a motor at the other end.
Now, I want to build much the same device: a master-slave system which operates electrically. But I want the slaves to be made especially carefully by modern large-scale machinists so that they are one-fourth the scale of the “hands” that you ordinarily maneuver. So you have a scheme by which you can do things at one-quarter scale anyway: the little servo motors with little hands play with little nuts and bolts; they drill little holes; they are four times smaller. Aha! So I manufacture a quarter-size lathe; I manufacture quarter-size tools; and I make, at the one-quarter scale, still another set of hands, again relatively one-quarter size! This is one-sixteenth size, from my point of view. And after I finish doing this I wire directly from my large-scale system, through transformers perhaps, to the one-sixteenth-size servo motors. Thus I can now manipulate the one-sixteenth-size hands.
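The cascade described above can be tallied up; this sketch just counts how many factor-of-four stages are needed to reach the 4,000-times reduction used earlier:

```python
# Each master-slave stage shrinks the scale by 4, so after n stages
# the scale is 4**-n. How many stages reach a 4000x reduction?
import math

target = 4000
stages = math.ceil(math.log(target, 4))
print(stages, 4 ** stages)  # 6 stages overshoot slightly, to 4096x
```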
Well, you get the principle from there on. It is rather a difficult program, but it is a possibility. You might say that one can go much farther in one step than from one to four. Of course, this has all to be designed very carefully and it is not necessary simply to make it like hands. If you thought of it very carefully, you could probably arrive at a much better system for doing such things.
If you work through a pantograph, even today, you can get much more than a factor of four in even one step. But you can’t work directly through a pantograph which makes a smaller pantograph which then makes a smaller pantograph because of the looseness of the holes and the irregularities of construction. The end of the pantograph wiggles with a relatively greater irregularity than the irregularity with which you move your hands. In going down this scale, I would find the end of the pantograph on the end of the pantograph on the end of the pantograph shaking so badly that it wasn’t doing anything sensible at all.
At each stage, it is necessary to improve the precision of the apparatus. If, for instance, having made a small lathe with a pantograph, we find its lead screw irregular (more irregular than the large-scale one) we could lap the lead screw against breakable nuts that you can reverse in the usual way back and forth until this lead screw is, at its scale, as accurate as our original lead screws, at our scale.
We can make flats by rubbing unflat surfaces in triplicates together (in three pairs) and the flats then become flatter than the thing you started with. Thus, it is not impossible to improve precision on a small scale by the correct operations. So, when we build this stuff, it is necessary at each step to improve the accuracy of the equipment by working for a while down there, making accurate lead screws, Johansen blocks, and all the other materials which we use in accurate machine work at the higher level. We have to stop at each level and manufacture all the stuff to go to the next level: a very long and very difficult program. Perhaps you can figure a better way than that to get down to small scale more rapidly.
Yet, after all this, you have just got one little baby lathe four thousand times smaller than usual. But we were thinking of making an enormous computer, which we were going to build by drilling holes on this lathe to make little washers for the computer. How many washers can you manufacture on this one lathe?
Where am I going to put the million lathes that I am going to have? Why, there is nothing to it; the volume is much less than that of even one full-scale lathe. For instance, if I made a billion little lathes, each 1/4000 of the scale of a regular lathe, there are plenty of materials and space available because in the billion little ones there is less than 2 percent of the materials in one big lathe.
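The "less than 2 percent" figure follows from cubing the linear scale factor; a quick check:

```python
# A billion lathes, each reduced 4000x in linear scale, occupy
# (1/4000)**3 of the volume of a full-scale lathe apiece.
n = 1_000_000_000
linear = 1 / 4000
fraction = n * linear ** 3  # total material relative to one big lathe
print(f"{fraction:.2%} of one full-scale lathe")  # 1.56%
```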
It doesn’t cost anything for materials, you see. So I want to build a billion tiny factories, models of each other, which are manufacturing simultaneously, drilling holes, stamping parts, and so on.
As we go down in size, there are a number of interesting problems that arise. All things do not simply scale down in proportion. There is the problem that materials stick together by the molecular (Van der Waals) attractions. It would be like this: After you have made a part and you unscrew the nut from a bolt, it isn’t going to fall down because the gravity isn’t appreciable; it would even be hard to get it off the bolt. It would be like those old movies of a man with his hands full of molasses, trying to get rid of a glass of water. There will be several problems of this nature that we will have to be ready to design for.
Up to now, we have been content to dig in the ground to find minerals. We heat them and we do things on a large scale with them, and we hope to get a pure substance with just so much impurity, and so on. But we must always accept some atomic arrangement that nature gives us. We haven’t got anything, say, with a “checkerboard” arrangement, with the impurity atoms exactly arranged 1,000 angstroms apart, or in some other particular pattern.
What could we do with layered structures with just the right layers? What would the properties of materials be if we could really arrange the atoms the way we want them? They would be very interesting to investigate theoretically. I can’t see exactly what would happen, but I can hardly doubt that when we have some control of the arrangement of things on a small scale we will get an enormously greater range of possible properties that substances can have, and of different things that we can do.
Consider, for example, a piece of material in which we make little coils and condensers (or their solid state analogs) 1,000 or 10,000 angstroms in a circuit, one right next to the other, over a large area, with little antennas sticking out at the other end: a whole series of circuits. Is it possible, for example, to emit light from a whole set of antennas, like we emit radio waves from an organized set of antennas to beam the radio programs to Europe? The same thing would be to beam the light out in a definite direction with very high intensity. (Perhaps such a beam is not very useful technically or economically.)
I have thought about some of the problems of building electric circuits on a small scale, and the problem of resistance is serious. If you build a corresponding circuit on a small scale, its natural frequency goes up, since the wave length goes down as the scale; but the skin depth only decreases with the square root of the scale ratio, and so resistive problems are of increasing difficulty. Possibly we can beat resistance through the use of superconductivity if the frequency is not too high, or by other tricks.
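The skin-depth argument can be sketched as a scaling relation; the factor of 100 below is an arbitrary illustrative reduction, not a figure from the text:

```python
# Scale a circuit down by linear factor s. Its resonant wavelength scales
# as s, so frequency goes as 1/s; skin depth goes as 1/sqrt(frequency),
# i.e. as sqrt(s). Relative to the conductor size (which shrank as s),
# the skin depth therefore grows as 1/sqrt(s), and resistance worsens.
import math

def relative_skin_depth(s: float) -> float:
    """Skin depth divided by conductor size, normalized to 1 at full scale."""
    return math.sqrt(s) / s  # = 1 / sqrt(s)

print(relative_skin_depth(1 / 100))  # 10x thicker relative skin depth at 1/100 scale
```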
Another thing we will notice is that, if we go down far enough, all of our devices can be mass produced so that they are absolutely perfect copies of one another. We cannot build two large machines so that the dimensions are exactly the same. But if your machine is only 100 atoms high, you only have to get it correct to one-half of one percent to make sure the other machine is exactly the same size: namely, 100 atoms high!
At the atomic level, we have new kinds of forces and new kinds of possibilities, new kinds of effects. The problems of manufacture and reproduction of materials will be quite different. I am, as I said, inspired by the biological phenomena in which chemical forces are used in a repetitious fashion to produce all kinds of weird effects (one of which is the author).
The principles of physics, as far as I can see, do not speak against the possibility of maneuvering things atom by atom. It is not an attempt to violate any laws; it is something, in principle, that can be done; but in practice, it has not been done because we are too big.
Ultimately, we can do chemical synthesis. A chemist comes to us and says, “Look, I want a molecule that has the atoms arranged thus and so; make me that molecule.” The chemist does a mysterious thing when he wants to make a molecule. He sees that it has got that ring, so he mixes this and that, and he shakes it, and he fiddles around. And, at the end of a difficult process, he usually does succeed in synthesizing what he wants. By the time I get my devices working, so that we can do it by physics, he will have figured out how to synthesize absolutely anything, so that this will really be useless.
But it is interesting that it would be, in principle, possible (I think) for a physicist to synthesize any chemical substance that the chemist writes down. Give the orders and the physicist synthesizes it. How? Put the atoms down where the chemist says, and so you make the substance. The problems of chemistry and biology can be greatly helped if our ability to see what we are doing, and to do things on an atomic level, is ultimately developed: a development which I think cannot be avoided.
Now, you might say, “Who should do this and why should they do it?” Well, I pointed out a few of the economic applications, but I know that the reason that you would do it might be just for fun. But have some fun! Let’s have a competition between laboratories. Let one laboratory make a tiny motor which it sends to another lab which sends it back with a thing that fits inside the shaft of the first motor.
Perhaps this doesn’t excite you to do it, and only economics will do so. Then I want to do something; but I can’t do it at the present moment, because I haven’t prepared the ground. It is my intention to offer a prize of $1,000 to the first guy who can take the information on the page of a book and put it on an area 1/25,000 smaller in linear scale in such manner that it can be read by an electron microscope.
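As a rough check of what the prize asks for, shrinking a page by 1/25,000 in linear scale gives a target only micrometres across; the 6 by 9 inch page size below is an assumed typical value:

```python
# A book page reduced 25,000x in linear scale, assuming a 6 x 9 inch page.
INCH_M = 0.0254
scale = 1 / 25_000
w = 6 * INCH_M * scale
h = 9 * INCH_M * scale
print(f"{w * 1e6:.1f} x {h * 1e6:.1f} micrometres")  # ≈ 6.1 x 9.1 μm
```

That is comfortably within the field of view of an electron microscope, which is why the condition "readable by an electron microscope" is achievable at all.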
And I want to offer another prize (if I can figure out how to phrase it so that I don’t get into a mess of arguments about definitions) of another $1,000 to the first guy who makes an operating electric motor: a rotating electric motor which can be controlled from the outside and, not counting the lead-in wires, is only 1/64 inch cube.
I do not expect that such prizes will have to wait very long for claimants.
Posted: July 3, 2016 at 12:14 pm
This one has been a little slower and more complex to develop than expected, but after a long 3 months (4 really, but one was taken up with our first holiday in 4 years!) B1414 is now live for everyone. This update brings more complete car design aspects, along with car designer scenarios and a much improved user interface. There are also quite a few more new car bodies to base your designs on.
The next few months will be dedicated to some fairly unexciting but very important work, ready for release on Steam. We’ll be making major improvements aimed at making Automation more polished, easier to learn, and all around more professional looking. More improvements to the UI and the process of model designing will make it much more logical and simple to design a big range of models based on one particular car. Pop-ups/tooltips will be added for many things, and tutorial videos will be done/redone covering every technical part of the game. We’ll also be aiming to add a bunch of improved multiplayer modes, including Lap Time and Rally Stage Time challenge modes, with the ability to set your scores over a few days, so you don’t need to all be online together to compete. This update will be the first one to release on Steam, which is an exciting milestone, and will hopefully bring in more sales, allowing us to bring further people on to work on car body art and other new content. And after this update is out of the way, it’s finally time to start work on the Tycoon aspect of things!
Car Designer Features & Changes
- Added Mid Engine Cars
- Suspension Easy Mode
- Added Quality Sliders and dependencies for all car designer tabs
- Adjustable Rim Offsets
- Added Multilink Suspension
- Added the first automatic gearboxes
- Reliability and Environmental Resistance stats
- Passenger Space and Cargo Space stats
- Production Units, Costs and Service Costs stats
- Offroad and Utility stats
- Rebalanced Sportiness, Tameness, Comfort, Prestige and Safety calculations
- 9 Car Designer Scenarios
- Rebalanced material properties
- Many new part year dependencies
- Limited cars to a maximum of 2 wings and 2 lips
- Added tire profile year limitation
- Base safety will stop progressing 10 years after a body first unlocks
- Bodies sorted by year, newest at the top
- Revised the Bottoming Out calculations to be less harsh
- Wings/Lips no longer punch holes in the body shell
Car Designer Fixes
- Fixed the crash caused by using MPH + dragging the top speed slider to top for high-revving engines (finally!)
- Fixed the Yaw Rate graph cut-off when using mph as a unit for speed
- Fixed certain cars not being able to complete a lap
- Fixed the proper gear delay being used on the test track
- Fixed Front Longitudinal AWD engine placement issues
- Fixed the sensitivity of resizing various fixtures, making it more responsive
- Fixed steamroller bug where wheels would become comically wide

New Car Bodies
- Large 60s Coupe
- 2 Large 70s Coupes
- Large 90s Coupe
- Large 60s Sedan
- Large 00s Sedan
- Small 80s Supercar
- Small 10s Supercar
- Large 10s Supercar
UI & Sound
- Completely reworked UI and UI flow
- Car Design Wizard for the whole car design process
- All new UI sounds
- Ambient sounds
- Added test track sounds

New Car & Engine Manager
- Temporarily changed the Platform/Model game mechanic
- Many more stats on the three different testing pages
- Updated graphs
- Updated test track UI
- Manual start for car testing on testing page

Engine Designer Fixes / Rebalances
- Reduced power gain when richening up the fuel mixture
- Octane rating in VVL systems uses the lower cam setting
- Added bypass valve year limitation
- Fixed the “your engine was created in a previous version” message bug
- Fixed bore and stroke having two decimals too few using imperial units
- Fixed a bug where loading a VVL engine set the wrong lower cam setting
- Fixed an engine loading bug that caused the block config lua error
General Things
- Changed MTBF to Reliability for less confusion
- New scenario scoring system implemented for car designer scenarios
- Fixed various aerodynamics calculations and exploits
- Changed all Man Hours to Production Units
- Added console, accessed by pressing tilde (~). Commands are help(), HideBuildings(), ShowBuildings().
- Changed to saving screenshots as PNG. If you turn off FXAA and use the HideBuildings() command, you can take pictures of engines/cars on a transparent backdrop.
- Fixed the tutorial video sound cutting off after a minute
- Thumbnails are now deleted when you delete the model/engine they belong to
- Many more little fixes
Posted: June 12, 2016 at 8:25 pm
Biological warfare (BW), also known as germ warfare, is the use of biological toxins or infectious agents such as bacteria, viruses, and fungi with the intent to kill or incapacitate humans, animals or plants as an act of war. Biological weapons (often termed “bio-weapons”, “biological threat agents”, or “bio-agents”) are living organisms or replicating entities (viruses, which are not universally considered “alive”) that reproduce or replicate within their host victims. Entomological (insect) warfare is also considered a type of biological weapon. This type of warfare is distinct from nuclear warfare and chemical warfare, which together with biological warfare make up NBC, the military acronym for nuclear, biological, and chemical warfare using weapons of mass destruction (WMDs). None of these are conventional weapons, which are deployed primarily for their explosive, kinetic, or incendiary potential.
Biological weapons may be employed in various ways to gain a strategic or tactical advantage over the enemy, either by threats or by actual deployments. Like some of the chemical weapons, biological weapons may also be useful as area denial weapons. These agents may be lethal or non-lethal, and may be targeted against a single individual, a group of people, or even an entire population. They may be developed, acquired, stockpiled or deployed by nation states or by non-national groups. In the latter case, or if a nation-state uses it clandestinely, it may also be considered bioterrorism.
There is an overlap between biological warfare and chemical warfare, as the use of toxins produced by living organisms is considered under the provisions of both the Biological Weapons Convention and the Chemical Weapons Convention. Toxins and psychochemical weapons are often referred to as midspectrum agents. Unlike bioweapons, these midspectrum agents do not reproduce in their host and are typically characterized by shorter incubation periods.
Offensive biological warfare, including mass production, stockpiling and use of biological weapons, was outlawed by the 1972 Biological Weapons Convention (BWC). The rationale behind this treaty, which has been ratified or acceded to by 170 countries as of April 2013, is to prevent a biological attack which could conceivably result in large numbers of civilian casualties and cause severe disruption to economic and societal infrastructure. Many countries, including signatories of the BWC, currently pursue research into the defense or protection against BW, which is not prohibited by the BWC.
A nation or group that can pose a credible threat of mass casualty has the ability to alter the terms on which other nations or groups interact with it. Biological weapons allow for the potential to create a level of destruction and loss of life far in excess of nuclear, chemical or conventional weapons, relative to their mass and cost of development and storage. Therefore, biological agents may be useful as strategic deterrents in addition to their utility as offensive weapons on the battlefield.
As a tactical weapon for military use, a significant problem with a BW attack is that it would take days to be effective, and therefore might not immediately stop an opposing force. Some biological agents (smallpox, pneumonic plague) have the capability of person-to-person transmission via aerosolized respiratory droplets. This feature can be undesirable, as the agent(s) may be transmitted by this mechanism to unintended populations, including neutral or even friendly forces. While containment of BW is less of a concern for certain criminal or terrorist organizations, it remains a significant concern for the military and civilian populations of virtually all nations.
Rudimentary forms of biological warfare have been practiced since antiquity. During the 6th century BC, the Assyrians poisoned enemy wells with a fungus that would render the enemy delirious. In 1346, the bodies of Mongol warriors of the Golden Horde who had died of plague were thrown over the walls of the besieged Crimean city of Kaffa. Specialists disagree over whether this operation may have been responsible for the spread of the Black Death into Europe.
It has been claimed that the British Marines used smallpox in New South Wales in 1789. Historians have long debated inconclusively whether the British Army used smallpox in an episode against Native Americans in 1763.
By 1900 the germ theory and advances in bacteriology brought a new level of sophistication to the techniques for possible use of bio-agents in war. Biological sabotage, in the form of anthrax and glanders, was undertaken on behalf of the Imperial German government during World War I (1914–1918), with indifferent results. The Geneva Protocol of 1925 prohibited the use of chemical weapons and biological weapons.
With the onset of World War II, the Ministry of Supply in the United Kingdom established a BW program at Porton Down, headed by the microbiologist Paul Fildes. The research was championed by Winston Churchill and soon tularemia, anthrax, brucellosis, and botulism toxins had been effectively weaponized. In particular, during a series of extensive tests, Gruinard Island in Scotland was contaminated with anthrax and remained so for the next 56 years. Although the UK never offensively used the biological weapons it developed on its own, its program was the first to successfully weaponize a variety of deadly pathogens and bring them into industrial production.
When the USA entered the war, mounting British pressure for the creation of a similar research program and for an Allied pooling of resources led to the creation of a large industrial complex at Fort Detrick, Maryland in 1942 under the direction of George W. Merck. The biological and chemical weapons developed during that period were tested at the Dugway Proving Grounds in Utah. Soon there were facilities for the mass production of anthrax spores, brucellosis, and botulism toxins, although the war was over before these weapons could be of much operational use.
The most notorious program of the period was run by the secret Imperial Japanese Army Unit 731 during the war, based at Pingfan in Manchuria and commanded by Lieutenant General Shirō Ishii. This unit did research on BW, conducted often fatal human experiments on prisoners, and produced biological weapons for combat use. Although the Japanese effort lacked the technological sophistication of the American or British programs, it far outstripped them in its widespread application and indiscriminate brutality. Biological weapons were used against both Chinese soldiers and civilians in several military campaigns. In 1940, the Japanese Army Air Force bombed Ningbo with ceramic bombs full of fleas carrying the bubonic plague. Many of these operations were ineffective due to inefficient delivery systems, although up to 400,000 people may have died. During the Zhejiang-Jiangxi Campaign in 1942, around 1,700 Japanese troops died out of a total 10,000 Japanese soldiers who fell ill with disease when their own biological weapons attack rebounded on their own forces.
During the final months of World War II, Japan planned to use plague as a biological weapon against U.S. civilians in San Diego, California, during Operation Cherry Blossoms at Night. The plan was set to launch on 22 September 1945, but it was not executed because of Japan’s surrender on 15 August 1945.
In Britain, the 1950s saw the weaponization of plague, brucellosis, tularemia and later equine encephalomyelitis and vaccinia viruses, but the programme was unilaterally cancelled in 1956. The United States Army Biological Warfare Laboratories weaponized anthrax, tularemia, brucellosis, Q-fever and others.
In 1969, the UK and the Warsaw Pact, separately, introduced proposals to the UN to ban biological weapons, and US President Richard Nixon terminated production of biological weapons, allowing only scientific research for defensive measures. The Biological and Toxin Weapons Convention was signed by the US, UK, USSR and other nations, as a ban on “development, production and stockpiling of microbes or their poisonous products except in amounts necessary for protective and peaceful research” in 1972. However, the Soviet Union continued research and production of massive offensive biological weapons in a program called Biopreparat, despite having signed the convention. By 2011, 165 countries had signed the treaty and none are proven (though nine are still suspected) to possess offensive BW programs.
It has been argued that rational state actors would never use biological weapons offensively. The argument is that biological weapons cannot be controlled: the weapon could backfire and harm the army on the offensive, perhaps having even worse effects than on the target. An agent like smallpox or other airborne viruses would almost certainly spread worldwide and ultimately infect the user’s home country. However, this argument does not necessarily apply to bacteria. For example, anthrax can easily be controlled and even created in a garden shed; the FBI suspects it can be done for as little as $2,500 using readily available laboratory equipment. Also, using microbial methods, bacteria can be suitably modified to be effective in only a narrow environmental range, the range of the target that distinctly differs from the army on the offensive. Thus only the target might be affected adversely. The weapon may be further used to bog down an advancing army making them more vulnerable to counterattack by the defending force.
Ideal characteristics of a biological agent to be used as a weapon against humans are high infectivity, high virulence, non-availability of vaccines, and availability of an effective and efficient delivery system. Stability of the weaponized agent (ability of the agent to retain its infectivity and virulence after a prolonged period of storage) may also be desirable, particularly for military applications, and the ease of creating one is often considered. Control of the spread of the agent may be another desired characteristic.
The primary difficulty is not the production of the biological agent, as many biological agents used in weapons can often be manufactured relatively quickly, cheaply and easily. Rather, it is the weaponization, storage and delivery in an effective vehicle to a vulnerable target that pose significant problems.
For example, Bacillus anthracis is considered an effective agent for several reasons. First, it forms hardy spores, perfect for dispersal as aerosols. Second, this organism is not considered transmissible from person to person, and thus rarely if ever causes secondary infections. A pulmonary anthrax infection starts with ordinary influenza-like symptoms and progresses to a lethal hemorrhagic mediastinitis within 3–7 days, with a fatality rate that is 90% or higher in untreated patients. Finally, friendly personnel can be protected with suitable antibiotics.
A large-scale attack using anthrax would require the creation of aerosol particles of 1.5 to 5 μm: larger particles would not reach the lower respiratory tract, while smaller particles would be exhaled back out into the atmosphere. At this size, powders tend to aggregate because of electrostatic charges, hindering dispersion, so the material must be treated to insulate and neutralize the charges. The weaponized agent must also be resistant to degradation by rain and by ultraviolet radiation from sunlight, while retaining the ability to efficiently infect the human lung. There are other technological difficulties as well, chiefly relating to storage of the weaponized agent.
Agents considered for weaponization, or known to be weaponized, include bacteria such as Bacillus anthracis, Brucella spp., Burkholderia mallei, Burkholderia pseudomallei, Chlamydophila psittaci, Coxiella burnetii, Francisella tularensis, some of the Rickettsiaceae (especially Rickettsia prowazekii and Rickettsia rickettsii), Shigella spp., Vibrio cholerae, and Yersinia pestis. Many viral agents have been studied and/or weaponized, including some of the Bunyaviridae (especially Rift Valley fever virus), Ebolavirus, many of the Flaviviridae (especially Japanese encephalitis virus), Machupo virus, Marburg virus, Variola virus, and Yellow fever virus. Fungal agents that have been studied include Coccidioides spp.
Toxins that can be used as weapons include ricin, staphylococcal enterotoxin B, botulinum toxin, saxitoxin, and many mycotoxins. These toxins and the organisms that produce them are sometimes referred to as select agents. In the United States, their possession, use, and transfer are regulated by the Centers for Disease Control and Prevention’s Select Agent Program.
The former US biological warfare program categorized its weaponized anti-personnel bio-agents as either Lethal Agents (Bacillus anthracis, Francisella tularensis, Botulinum toxin) or Incapacitating Agents (Brucella suis, Coxiella burnetii, Venezuelan equine encephalitis virus, Staphylococcal enterotoxin B).
The United States developed an anti-crop capability during the Cold War that used plant diseases (bioherbicides, or mycoherbicides) to destroy enemy agriculture. Biological weapons were also directed at fisheries and water-based vegetation. It was believed that destruction of enemy agriculture on a strategic scale could thwart Sino-Soviet aggression in a general war. Diseases such as wheat blast and rice blast were weaponized in aerial spray tanks and cluster bombs for delivery to enemy watersheds in agricultural regions to initiate epiphytotics (epidemics among plants). When the United States renounced its offensive biological warfare program in 1969 and 1970, the vast majority of its biological arsenal was composed of these plant diseases. Enterotoxins and mycotoxins were not affected by Nixon's order.
Though herbicides are chemicals, they are often grouped with biological warfare and chemical warfare because they may work in a similar manner as biotoxins or bioregulators. The Army Biological Laboratory tested each agent and the Army’s Technical Escort Unit was responsible for transport of all chemical, biological, radiological (nuclear) materials. Scorched earth tactics or destroying livestock and farmland were carried out in the Vietnam war (cf. Agent Orange) and Eelam War in Sri Lanka.
Biological warfare can also specifically target plants to destroy crops or defoliate vegetation. The United States and Britain discovered plant growth regulators (i.e., herbicides) during the Second World War, and initiated a herbicidal warfare program that was eventually used in Malaya and Vietnam in counterinsurgency operations.
In the 1980s, the Soviet Ministry of Agriculture successfully developed variants of foot-and-mouth disease and rinderpest for use against cattle, African swine fever for pigs, and psittacosis to kill chickens. These agents were prepared to be sprayed from tanks attached to airplanes over hundreds of miles. The secret program was code-named "Ecology".
Attacking animals is another area of biological warfare intended to eliminate animal resources for transportation and food. In the First World War, German agents were arrested attempting to inoculate draft animals with anthrax, and they were believed to be responsible for outbreaks of glanders in horses and mules. The British tainted small feed cakes with anthrax in the Second World War as a potential means of attacking German cattle for food denial, but never employed the weapon. In the 1950s, the United States had a field trial with hog cholera. During the Mau Mau Uprising in 1952, the poisonous latex of the African milk bush was used to kill cattle.
Outside the context of war, humans have deliberately introduced the rabbit disease myxomatosis, which originated in South America, into Australia and Europe with the intention of reducing the rabbit population. The results were devastating but temporary: wild rabbit populations were reduced to a fraction of their former size, but survivors developed immunity and increased again.
Entomological warfare (EW) is a type of biological warfare that uses insects to attack the enemy. The concept has existed for centuries, and research and development have continued into the modern era. EW has been used in battle by Japan, and several other nations have developed, and been accused of using, entomological warfare programs. EW may employ insects in a direct attack or as vectors to deliver a biological agent, such as plague. Essentially, EW exists in three varieties. One type involves infecting insects with a pathogen and then dispersing the insects over target areas; the insects then act as a vector, infecting any person or animal they might bite. Another type is a direct insect attack against crops; the insect may not be infected with any pathogen but instead represents a threat to agriculture. The final method uses uninfected insects, such as bees or wasps, to directly attack the enemy.
In 2010, at the Meeting of the States Parties to the Convention on the Prohibition of the Development, Production and Stockpiling of Bacteriological (Biological) and Toxin Weapons and Their Destruction in Geneva, sanitary-epidemiological reconnaissance was suggested as a well-tested means of enhancing the monitoring of infectious and parasitic agents and of practically implementing the International Health Regulations (2005). The aim was to prevent and minimize the consequences of natural outbreaks of dangerous infectious diseases, as well as the threat of the alleged use of biological weapons against BTWC States Parties.
Notably, the pathogens behind most classical and modern biological weapons can be obtained from a naturally infected plant or animal.
Indeed, in the largest known biological weapons accident (the anthrax outbreak in Sverdlovsk, now Yekaterinburg, in the Soviet Union in 1979), sheep became ill with anthrax as far as 200 kilometers from the release point at a military facility in the southeastern portion of the city, an area still off limits to visitors today (see the Sverdlovsk anthrax leak).
Thus, a robust surveillance system involving human clinicians and veterinarians may identify a bioweapons attack early in the course of an epidemic, permitting the prophylaxis of disease in the vast majority of people (and/or animals) exposed but not yet ill.
For example, in the case of anthrax, it is likely that by 24 to 36 hours after an attack, some small percentage of individuals (those with compromised immune systems, or who received a large dose of the organism due to proximity to the release point) will become ill with classical symptoms and signs, including a virtually unique chest X-ray finding often recognized by public health officials if they receive timely reports. The incubation period for humans is estimated to be about 11.8 days to 12.1 days; this suggested period is the first model that is independently consistent with data from the largest known human outbreak. These projections refine previous estimates of the distribution of early-onset cases after a release and support a recommended 60-day course of prophylactic antibiotic treatment for individuals exposed to low doses of anthrax. If these data are made available to local public health officials in real time, most models of anthrax epidemics indicate that more than 80% of an exposed population can receive antibiotic treatment before becoming symptomatic, and thus avoid the moderately high mortality of the disease.
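The timing argument above (early detection plus rapid prophylaxis lets most exposed people be treated before symptom onset) can be sketched numerically. This is a hypothetical illustration only: it assumes a lognormal incubation-period distribution, a common choice in epidemiological models, with the median taken from the figure in the text and a dispersion factor chosen purely for illustration.

```python
# Illustrative sketch of surveillance-timing arithmetic for an inhalational
# anthrax release. The median incubation (~11.9 days) comes from the text;
# the lognormal shape and its dispersion factor (gsdf) are assumptions.
import math

def incubation_cdf(t_days, median_days=11.9, gsdf=2.0):
    """P(incubation period <= t_days) under an assumed lognormal model.
    gsdf is the geometric standard-deviation factor (assumed)."""
    if t_days <= 0:
        return 0.0
    mu = math.log(median_days)      # log-median
    sigma = math.log(gsdf)          # log-scale spread
    z = (math.log(t_days) - mu) / (sigma * math.sqrt(2))
    return 0.5 * (1.0 + math.erf(z))

# Suppose surveillance detects the first (high-dose) cases ~1.5 days after
# release and mass antibiotic prophylaxis reaches people by day 3. The
# fraction already symptomatic -- i.e., treated too late -- is then:
late = incubation_cdf(3.0)
print(f"symptomatic before prophylaxis (day 3): {late:.1%}")
print(f"reachable before symptom onset: {1 - late:.1%}")
```

Under these assumed parameters, the model reproduces the qualitative claim in the text: well over 80% of an exposed population could be treated before becoming symptomatic.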
Indicators that an outbreak may be the result of a deliberate biological attack, from most specific to least specific:
1. A single case of a certain disease caused by an uncommon agent, with no adequate epidemiological explanation.
2. Unusual, rare, genetically engineered strain of an agent.
3. High morbidity and mortality rates among patients with the same or similar symptoms.
4. Unusual presentation of the disease.
5. Unusual geographic or seasonal distribution.
6. A stable endemic disease with an unexplained increase in incidence.
7. Rare transmission (aerosols, food, water).
8. No illness in people who are not exposed to common ventilation systems (i.e., who have separate, closed ventilation systems), when illness is seen in persons in close proximity who share a common ventilation system.
9. Different and unexplained diseases coexisting in the same patient without any other explanation.
10. Rare illness that affects a large, disparate population (respiratory disease might suggest the pathogen or agent was inhaled).
11. Illness that is unusual for the population or age group in which it presents.
12. Unusual trends of death and/or illness in animal populations, prior to or accompanying illness in humans.
13. Many affected people seeking treatment at the same time.
14. Similar genetic makeup of agents in affected individuals.
15. Simultaneous clusters of similar illness in non-contiguous areas, domestic or foreign.
16. An abundance of cases of unexplained diseases and deaths.
The goal of biodefense is to integrate the sustained efforts of the national and homeland security, medical, public health, intelligence, diplomatic, and law enforcement communities. Health care providers and public health officers are among the first lines of defense. In some countries private, local, and provincial (state) capabilities are being augmented by and coordinated with federal assets, to provide layered defenses against biological weapon attacks. During the first Gulf War the United Nations activated a biological and chemical response team, Task Force Scorpio, to respond to any potential use of weapons of mass destruction on civilians.
The traditional approach toward protecting agriculture, food, and water (focusing on the natural or unintentional introduction of a disease) is being strengthened by focused efforts to address current and anticipated future biological weapons threats that may be deliberate, multiple, and repetitive.
The growing threat of biowarfare agents and bioterrorism has led to the development of specific field tools that perform on-the-spot analysis and identification of encountered suspect materials. One such technology, being developed by researchers from the Lawrence Livermore National Laboratory (LLNL), employs a “sandwich immunoassay”, in which fluorescent dye-labeled antibodies aimed at specific pathogens are attached to silver and gold nanowires.
In the Netherlands, the company TNO has designed the Bioaerosol Single Particle Recognition eQuipment (BiosparQ). This system would be incorporated into the national response plan for bioweapon attacks in the Netherlands.
Researchers at Ben-Gurion University in Israel are developing a different device called the BioPen, essentially a "lab in a pen," which can detect known biological agents in under 20 minutes using an adaptation of ELISA, a widely employed immunological technique, that in this case incorporates fiber optics.
Theoretically, novel approaches in biotechnology, such as synthetic biology, could be used in the future to design novel types of biological warfare agents; special attention must therefore be paid to future "experiments of concern".
Most of the biosecurity concerns in synthetic biology, however, are focused on the role of DNA synthesis and the risk of producing genetic material of lethal viruses (e.g. 1918 Spanish flu, polio) in the lab. Recently, the CRISPR/Cas system has emerged as a promising technique for gene editing. It was hailed by The Washington Post as “the most important innovation in the synthetic biology space in nearly 30 years.” While other methods take months or years to edit gene sequences, CRISPR speeds that time up to weeks. However, due to its ease of use and accessibility, it has raised a number of ethical concerns, especially surrounding its use in the biohacking space.
Read this article:
Posted: October 4, 2015 at 4:43 pm
Read more from the original source:
Bitcoin Magazine | Bitcoin and Cryptocurrency News
Posted: August 30, 2015 at 7:46 pm
Earlier last week, n8fr8 suspected something had changed on the ostel.co server, because many users were emailing support specifically about Jitsi connectivity to ostel.co. The common question was: why did it work a few weeks ago when it doesn't anymore?
The tl;dr follows; skip to the keyword CONCLUSION to hear only the punch line.
In support of n8fr8's hypothesis, there had been a small change to the server, but I wasn't convinced it affected anything, since all my clients continued to work properly, including Jitsi. Obviously something had changed, but none of us knew what it was. After some testing, we discovered the problem was related to insecure connections from Jitsi to UDP port 5060 on ostel.co. Secure connections (on TCP port 5061) continued to work as expected.
To make matters more confusing, I could register and make calls with two different clients (CSipSimple and Linphone) on the same network (my home ISP, Verizon FiOS) using an insecure connection to ostel.co on UDP port 5060.
At this point I was like WTF?
I went back to the server, diffed all the configs, checked server versions, and connected with every client I could find that would run on any of my computers. The only change was a Kamailio upgrade from 4.0.1 to 4.0.2, a minor point release. The problem with Jitsi remained. What could the server be doing to this poor client?
I did a packet trace on the ostel.co server's public network interface, filtered to dump only packets on UDP port 5060 that matched my SIP username. I opened Jitsi and things got interesting. For the curious, here's the utility and options I used. If you are new to operating a SIP network, ngrep is an excellent tool for debugging.
ngrep -d eth0 -t -p -W byline foo port 5060
I'll include an excerpt (only the relevant headers for this issue) of the initial request from Jitsi. IP addresses and usernames have been changed to protect the innocent.
U 2013/07/19 22:17:34.920749 0.0.0.0:5060 -> 184.108.40.206:5060 REGISTER sip:ostel.co SIP/2.0. CSeq: 1 REGISTER. From: “foo”
# U 2013/07/19 22:17:34.921155 220.127.116.11:5060 -> 0.0.0.0:5060 SIP/2.0 401 Unauthorized. CSeq: 1 REGISTER. From: foo
If you read the response, you'll see Kamailio sent 401 Unauthorized. This is normal for SIP authentication. A second client request should follow, containing an Authorization header with an MD5 digest and a nonce. When Kamailio receives this request, it checks the auth database and sends a 200 OK response, and the client is authenticated.
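For readers unfamiliar with this challenge-response exchange, here is a minimal sketch of the MD5 digest computation a SIP client performs when answering a 401 challenge (per RFC 2617, as used by SIP). The username, realm, password, and nonce below are made-up example values, not real ostel.co credentials.

```python
# Sketch of SIP MD5 digest authentication (RFC 2617 "MD5" scheme).
# All credential values here are illustrative placeholders.
import hashlib

def md5_hex(s: str) -> str:
    return hashlib.md5(s.encode()).hexdigest()

def digest_response(username, realm, password, method, uri, nonce):
    """Compute the 'response' value for the Authorization header."""
    ha1 = md5_hex(f"{username}:{realm}:{password}")  # user credentials
    ha2 = md5_hex(f"{method}:{uri}")                 # request being made
    return md5_hex(f"{ha1}:{nonce}:{ha2}")           # bound to the nonce

# The client sends this digest back in the Authorization header of its
# second REGISTER; the server repeats the computation against its auth
# database and replies 200 OK if the values match.
resp = digest_response("foo", "ostel.co", "s3cret",
                       "REGISTER", "sip:ostel.co", "abc123nonce")
print(resp)  # a 32-character hex digest
```

The key point for the debugging story: the second REGISTER never happens if the 401 response never reaches the client.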
The SIP dialog looks good, but Jitsi still fails to register. The dialog flow is cut off after the 401 Unauthorized response. It's almost as if something has blocked the response to the client.
Since I could register Linphone using the same account, I ran the same trace for that client. Here's the excerpt.
U 2013/07/19 22:33:18.372770 0.0.0.0:42680 -> 18.104.22.168:5060 REGISTER sip:ostel.co SIP/2.0. Via: SIP/2.0/UDP 0.0.0.0:49153;rport;branch=z9hG4bK359459505. From:
# U 2013/07/19 22:33:18.373112 22.214.171.124:5060 -> 0.0.0.0:42680 SIP/2.0 401 Unauthorized. Via: SIP/2.0/UDP 0.0.0.0:49153;rport=42680;branch=z9hG4bK359459505. From:
This 401 Unauthorized response was received by the client, and the follow-up request with the Authorization header was sent with the correct digest. Linphone registered. I made a call. Everything worked fine. Indeed, WTF?
I stared at these traces for a while looking for a clue. Look again at the first line of the request from Jitsi. You'll see a timestamp followed by two IP:port pairs. Notice that the port on the first IP is 5060 and the port on the second IP is also 5060. This means the source port used by Jitsi on my home network is UDP port 5060. For a response to come back to Jitsi, it must enter my network on the same port it exited from. Now read the top line of the response from Kamailio. Indeed, the server sent the response to UDP port 5060.
Now look at the same flow for Linphone. There is a very different source port in that dialog. In this case, Kamailio sent the response to UDP port 42680, and Linphone received it. Also notice that the IP address Kamailio used as the destination of the response is the same one as in the Jitsi dialog.
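The reply-to-source-port behavior in these two traces can be demonstrated with plain UDP sockets. This is an illustrative sketch, not the real ostel.co setup: the "server" runs on loopback port 5599 (chosen to avoid needing privileges or colliding with real SIP traffic), and it assumes local port 5060 is free for the fixed-port client.

```python
# Sketch: a UDP reply must go back to whatever source port the request
# left from. Ports and the loopback "server" are illustrative only.
import socket

# "Server" on a fixed, well-known port (like Kamailio on 5060).
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 5599))

# Client 1: the OS picks an ephemeral source port (Linphone-style).
c1 = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
c1.sendto(b"REGISTER", ("127.0.0.1", 5599))
data, addr = server.recvfrom(1024)
print("ephemeral source port seen by server:", addr[1])

# The server replies to that exact (address, port) pair. A middlebox
# that only blocks inbound UDP to one fixed port never sees this reply.
server.sendto(b"401 Unauthorized", addr)
reply, _ = c1.recvfrom(1024)
print("client 1 got:", reply.decode())

# Client 2: a fixed source port (Jitsi's default of 5060). The reply is
# addressed back to that same fixed port -- exactly the traffic an
# upstream firewall blocking inbound UDP 5060 would silently drop.
c2 = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
c2.bind(("127.0.0.1", 5060))  # assumes this port is free locally
c2.sendto(b"REGISTER", ("127.0.0.1", 5599))
_, addr2 = server.recvfrom(1024)
print("fixed source port seen by server:", addr2[1])
```

On loopback nothing filters the replies, so both clients hear back; the point is only to show where each reply is addressed.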
The question remained: why can't Jitsi get the same kind of SIP response on UDP port 5060? And why is Jitsi using a single fixed source port for outgoing traffic anyway? That value can be dynamic. I configured Jitsi to use a different port for insecure SIP; it has an advanced SIP configuration key, "SIP client port". I set this to 5062 (5061 is conventionally used for secure SIP traffic, so I incremented by 2) and tried to register again. It worked.
To be thorough, I changed Jitsi's SIP port again, to a 5-digit number I randomly typed on my keyboard without looking.
So if Jitsi can register to Kamailio on any port other than UDP port 5060, WTF is going on? I had a suspicion, and I tried one more test before calling it: I configured Jitsi to connect on TCP port 5060. It registered successfully. Now I knew what was going on. I have a sad.
My ISP, Verizon FiOS, has a firewall running somewhere upstream (it could be on the router they provided; I haven't checked yet) that blocks incoming UDP traffic to port 5060. This probably falls under the section of their TOS that forbids running servers, since Verizon provides voice service for an additional fee on top of data service, despite both running over the same fiber connection to my house. It seems Verizon doesn't want their data-only customers to get in the way of that sweet cheddar delivery each month in exchange for phone service.
This sucks on two levels.
Why is my ISP censoring my incoming traffic when I have 5 Mbps of incoming bandwidth? I assume the answer is: because they can. *desolate frowny face*
Why doesn't Jitsi use a dynamic source port for SIP requests? I assume the answer is: Jitsi is open source, so why don't I change this and send a patch upstream?
Both levels are formidable challenges to overcome. Convincing Verizon to play nice on the Internet feels like a vanity project; I'm writing that off. Making a change to the SIP stack in Jitsi is well within the Guardian Project team's expertise, myself included, but it's not a trivial undertaking. Since this is a default configuration choice, the upstream devs probably had a reason for it, so in addition to the programming work there's the work of convincing the developers that this change would be worth a new release.
Since this is specific to Jitsi, I'm going to follow up with the developers and see if I missed anything. Stay tuned for part two.
Thanks for listening. Stay safe!
Read the original:
Jitsi, ostel.co and ISP censorship | The Guardian Project
Posted: February 25, 2015 at 12:40 am
Orlando, Fla. (PRWEB) February 24, 2015
Yogi Berra once famously said, "The future ain't what it used to be." And he was right. In fact, according to trend expert and keynote speaker Jack Uldrich, the future "is going to be downright unusual." This begs the obvious question: how do organizations prepare for an uncertain and unpredictable future? The answer, says Uldrich, "is that leaders and their organizations must think and act in unorthodox ways."
Uldrich, who delivered a keynote to executives of the Retail Industry Leaders Association (RILA) at the “All Channels. All Challenges. One Conference” last April, will address the group again today, February 24th. He will deliver his keynote: “Business as Unusual: How Future Trends Will Transform the Supply Chain of Tomorrow.” (Some of Uldrich’s other clients in retail and supply management include the Women’s Food Forum, TRUNO, the Food Marketing Institute, GameStop’s Executive Summit, Utility Supply Management Association, and Verizon Wireless.)
An expert in change management and future trends, Uldrich will continue his discussion with RILA on how individuals in retail can enhance their awareness of transformational changes that are coming to the industry. Highlights will include how retailers can learn to embrace ambiguity; why finding a reverse mentor could be crucial; and why taking small risks may very well be the safest thing retailers can do to position themselves for success in the years to come.
With this particular keynote, Uldrich’s goal is to help his audience at RILA unlearn the barriers currently holding them back and unlock new levels of creativity and innovation. He will conclude his keynote by guiding participants through a series of tangible actions that will unleash their ability to create their own future and, in the process, help them achieve uncommon levels of success.
In his blog post, Unlearn…Just in Case, Uldrich says, “the global supply chain is an impressive feat of modern management. The problem is that in its quest to squeeze out ever greater efficiencies with its ‘just-in-time’ system of inventory, it has left itself extremely vulnerable to large, rare and unpredictable black swan events.”
The future “ain’t what it used to be” and Jack Uldrich has his finger on the pulse of what it may be. Parties interested in learning more about Jack, his books, his daily blog or his speaking availability are encouraged to visit his website. Media wishing to know more about either the event or interviewing Jack as a futurist or trend expert can contact Amy Tomczyk at (651) 343.0660.