Tag Archives: utility

There's Plenty of Room at the Bottom Richard … – Zyvex

Posted: December 7, 2016 at 8:06 am

An Invitation to Enter a New Field of Physics

by Richard P. Feynman

This transcript of the classic talk that Richard Feynman gave on December 29th 1959 at the annual meeting of the American Physical Society at the California Institute of Technology (Caltech) was first published in Caltech Engineering and Science, Volume 23:5, February 1960, pp 22-36. It has been made available on the web at http://www.zyvex.com/nanotech/feynman.html with their kind permission. The scanned original is available.

The Wikipedia entry on Feynman’s talk.

Information on the Feynman Prizes

Search YouTube for Richard Feynman

For an account of the talk and how people reacted to it, see chapter 4 of Nano! by Ed Regis, Little/Brown 1995. An excellent technical introduction to nanotechnology is Nanosystems: molecular machinery, manufacturing, and computation by K. Eric Drexler, Wiley 1992.

The Feynman Lectures on Physics are available online.

I would like to describe a field, in which little has been done, but in which an enormous amount can be done in principle. This field is not quite the same as the others in that it will not tell us much of fundamental physics (in the sense of, “What are the strange particles?”) but it is more like solid-state physics in the sense that it might tell us much of great interest about the strange phenomena that occur in complex situations. Furthermore, a point that is most important is that it would have an enormous number of technical applications.

What I want to talk about is the problem of manipulating and controlling things on a small scale.

As soon as I mention this, people tell me about miniaturization, and how far it has progressed today. They tell me about electric motors that are the size of the nail on your small finger. And there is a device on the market, they tell me, by which you can write the Lord’s Prayer on the head of a pin. But that’s nothing; that’s the most primitive, halting step in the direction I intend to discuss. It is a staggeringly small world that is below. In the year 2000, when they look back at this age, they will wonder why it was not until the year 1960 that anybody began seriously to move in this direction.

Why cannot we write the entire 24 volumes of the Encyclopaedia Britannica on the head of a pin?

Let’s see what would be involved. The head of a pin is a sixteenth of an inch across. If you magnify it by 25,000 diameters, the area of the head of the pin is then equal to the area of all the pages of the Encyclopaedia Britannica. Therefore, all it is necessary to do is to reduce in size all the writing in the Encyclopaedia by 25,000 times. Is that possible? The resolving power of the eye is about 1/120 of an inch, which is roughly the diameter of one of the little dots on the fine half-tone reproductions in the Encyclopaedia. This, when you demagnify it by 25,000 times, is still 80 angstroms in diameter, or 32 atoms across, in an ordinary metal. In other words, one of those dots still would contain in its area 1,000 atoms. So, each dot can easily be adjusted in size as required by the photoengraving, and there is no question that there is enough room on the head of a pin to put all of the Encyclopaedia Britannica.
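Feynman's numbers here are easy to check. The sketch below is our own illustration, not part of the talk; the ~2.5 angstrom atomic spacing is an assumed value he does not state explicitly:

```python
# Back-of-the-envelope check of the pin-head arithmetic.
# Assumptions (ours, for illustration): 1 inch = 2.54e8 angstroms,
# atomic spacing in an ordinary metal ~ 2.5 angstroms.

INCH_IN_ANGSTROMS = 2.54e8

dot_inches = (1 / 120) / 25_000               # halftone dot demagnified 25,000x
dot_angstroms = dot_inches * INCH_IN_ANGSTROMS
atoms_across = dot_angstroms / 2.5            # assumed ~2.5 angstroms per atom

print(f"dot diameter: {dot_angstroms:.0f} angstroms")  # ~85 (Feynman: 80)
print(f"atoms across: {atoms_across:.0f}")             # ~34 (Feynman: 32)
print(f"atoms in dot area: {atoms_across ** 2:.0f}")   # on the order of 1,000
```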

Furthermore, it can be read if it is so written. Let’s imagine that it is written in raised letters of metal; that is, where the black is in the Encyclopedia, we have raised letters of metal that are actually 1/25,000 of their ordinary size. How would we read it?

If we had something written in such a way, we could read it using techniques in common use today. (They will undoubtedly find a better way when we do actually have it written, but to make my point conservatively I shall just take techniques we know today.) We would press the metal into a plastic material and make a mold of it, then peel the plastic off very carefully, evaporate silica into the plastic to get a very thin film, then shadow it by evaporating gold at an angle against the silica so that all the little letters will appear clearly, dissolve the plastic away from the silica film, and then look through it with an electron microscope!

There is no question that if the thing were reduced by 25,000 times in the form of raised letters on the pin, it would be easy for us to read it today. Furthermore, there is no question that we would find it easy to make copies of the master; we would just need to press the same metal plate again into plastic and we would have another copy.

One way to write would be to reverse the lenses of the electron microscope and demagnify a beam of ions down to a tiny writing spot; this method, however, might be very slow because of space charge limitations. There will be more rapid methods. We could first make, perhaps by some photo process, a screen which has holes in it in the form of the letters. Then we would strike an arc behind the holes and draw metallic ions through the holes; then we could again use our system of lenses and make a small image in the form of ions, which would deposit the metal on the pin.

A simpler way might be this (though I am not sure it would work): We take light and, through an optical microscope running backwards, we focus it onto a very small photoelectric screen. Then electrons come away from the screen where the light is shining. These electrons are focused down in size by the electron microscope lenses to impinge directly upon the surface of the metal. Will such a beam etch away the metal if it is run long enough? I don’t know. If it doesn’t work for a metal surface, it must be possible to find some surface with which to coat the original pin so that, where the electrons bombard, a change is made which we could recognize later.

There is no intensity problem in these devices, not what you are used to in magnification, where you have to take a few electrons and spread them over a bigger and bigger screen; it is just the opposite. The light which we get from a page is concentrated onto a very small area so it is very intense. The few electrons which come from the photoelectric screen are demagnified down to a very tiny area so that, again, they are very intense. I don’t know why this hasn’t been done yet!

That’s the Encyclopaedia Britannica on the head of a pin, but let’s consider all the books in the world. The Library of Congress has approximately 9 million volumes; the British Museum Library has 5 million volumes; there are also 5 million volumes in the National Library in France. Undoubtedly there are duplications, so let us say that there are some 24 million volumes of interest in the world.

What would happen if I print all this down at the scale we have been discussing? How much space would it take? It would take, of course, the area of about a million pinheads because, instead of there being just the 24 volumes of the Encyclopaedia, there are 24 million volumes. The million pinheads can be put in a square of a thousand pins on a side, or an area of about 3 square yards. That is to say, the silica replica with the paper-thin backing of plastic, with which we have made the copies, with all this information, is on an area of approximately the size of 35 pages of the Encyclopaedia. That is about half as many pages as there are in this magazine. All of the information which all of mankind has ever recorded in books can be carried around in a pamphlet in your hand and not written in code, but as a simple reproduction of the original pictures, engravings, and everything else on a small scale without loss of resolution.
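The pamphlet-sized library follows from the figures already stated; a quick sanity check (our own illustration):

```python
# 24 million volumes / 24 volumes per pinhead = 1 million pinheads,
# arranged 1,000 x 1,000; each pin head is 1/16 inch across, as stated.

pins_per_side = 1_000
side_inches = pins_per_side / 16      # 62.5 inches per side
side_yards = side_inches / 36         # 1 yard = 36 inches
area_sq_yards = side_yards ** 2

print(f"square of pins: {side_yards:.2f} yd per side")
print(f"area: {area_sq_yards:.1f} square yards")   # ~3, as Feynman says
```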

What would our librarian at Caltech say, as she runs all over from one building to another, if I tell her that, ten years from now, all of the information that she is struggling to keep track of (120,000 volumes, stacked from the floor to the ceiling, drawers full of cards, storage rooms full of the older books) can be kept on just one library card! When the University of Brazil, for example, finds that their library is burned, we can send them a copy of every book in our library by striking off a copy from the master plate in a few hours and mailing it in an envelope no bigger or heavier than any other ordinary air mail letter.

Now, the name of this talk is “There is Plenty of Room at the Bottom”, not just “There is Room at the Bottom.” What I have demonstrated is that there is room: you can decrease the size of things in a practical way. I now want to show that there is plenty of room. I will not now discuss how we are going to do it, but only what is possible in principle; in other words, what is possible according to the laws of physics. I am not inventing anti-gravity, which is possible someday only if the laws are not what we think. I am telling you what could be done if the laws are what we think; we are not doing it simply because we haven’t yet gotten around to it.

Suppose that, instead of reproducing the pictures directly, we write the information in a code of dots and dashes. Let us represent a dot by a small spot of one metal, the next dash by an adjacent spot of another metal, and so on. Suppose, to be conservative, that a bit of information is going to require a little cube of atoms 5 × 5 × 5, that is, 125 atoms. Perhaps we need a hundred and some odd atoms to make sure that the information is not lost through diffusion, or through some other process.

I have estimated how many letters there are in the Encyclopaedia, and I have assumed that each of my 24 million books is as big as an Encyclopaedia volume, and have calculated, then, how many bits of information there are (10^15). For each bit I allow 100 atoms. And it turns out that all of the information that man has carefully accumulated in all the books in the world can be written in this form in a cube of material one two-hundredth of an inch wide, which is the barest piece of dust that can be made out by the human eye. So there is plenty of room at the bottom! Don’t tell me about microfilm!
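The cube estimate can be reproduced from the stated figures; a minimal sketch (again assuming ~2.5 angstroms per atom, our own value):

```python
# From 10^15 bits at 100 atoms per bit to a cube visible as a dust speck.
# Assumed atomic spacing ~2.5 angstroms; 1 angstrom = 1e-8 cm.

bits = 1e15
atoms = bits * 100                    # 1e17 atoms in total
atoms_per_side = atoms ** (1 / 3)     # ~4.6e5 atoms along one edge
side_cm = atoms_per_side * 2.5e-8     # ~1.2e-2 cm
side_inches = side_cm / 2.54

print(f"cube edge: {side_inches:.4f} inch")               # ~0.0046 inch
print(f"roughly 1 / {1 / side_inches:.0f} of an inch")    # ~1/220; Feynman rounds to 1/200
```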

This fact that enormous amounts of information can be carried in an exceedingly small space is, of course, well known to the biologists, and resolves the mystery which existed before we understood all this clearly, of how it could be that, in the tiniest cell, all of the information for the organization of a complex creature such as ourselves can be stored. All this information (whether we have brown eyes, or whether we think at all, or that in the embryo the jawbone should first develop with a little hole in the side so that later a nerve can grow through it) is contained in a very tiny fraction of the cell in the form of long-chain DNA molecules in which approximately 50 atoms are used for one bit of information about the cell.

We have friends in other fields, in biology, for instance. We physicists often look at them and say, “You know the reason you fellows are making so little progress?” (Actually I don’t know any field where they are making more rapid progress than they are in biology today.) “You should use more mathematics, like we do.” They could answer us but they’re polite, so I’ll answer for them: “What you should do in order for us to make more rapid progress is to make the electron microscope 100 times better.”

What are the most central and fundamental problems of biology today? They are questions like: What is the sequence of bases in the DNA? What happens when you have a mutation? How is the base order in the DNA connected to the order of amino acids in the protein? What is the structure of the RNA; is it single-chain or double-chain, and how is it related in its order of bases to the DNA? What is the organization of the microsomes? How are proteins synthesized? Where does the RNA go? How does it sit? Where do the proteins sit? Where do the amino acids go in? In photosynthesis, where is the chlorophyll; how is it arranged; where are the carotenoids involved in this thing? What is the system of the conversion of light into chemical energy?

It is very easy to answer many of these fundamental biological questions; you just look at the thing! You will see the order of bases in the chain; you will see the structure of the microsome. Unfortunately, the present microscope sees at a scale which is just a bit too crude. Make the microscope one hundred times more powerful, and many problems of biology would be made very much easier. I exaggerate, of course, but the biologists would surely be very thankful to you and they would prefer that to the criticism that they should use more mathematics.

The theory of chemical processes today is based on theoretical physics. In this sense, physics supplies the foundation of chemistry. But chemistry also has analysis. If you have a strange substance and you want to know what it is, you go through a long and complicated process of chemical analysis. You can analyze almost anything today, so I am a little late with my idea. But if the physicists wanted to, they could also dig under the chemists in the problem of chemical analysis. It would be very easy to make an analysis of any complicated chemical substance; all one would have to do would be to look at it and see where the atoms are. The only trouble is that the electron microscope is one hundred times too poor. (Later, I would like to ask the question: Can the physicists do something about the third problem of chemistry, namely, synthesis? Is there a physical way to synthesize any chemical substance?)

The reason the electron microscope is so poor is that the f-value of the lenses is only 1 part to 1,000; you don’t have a big enough numerical aperture. And I know that there are theorems which prove that it is impossible, with axially symmetrical stationary field lenses, to produce an f-value any bigger than so and so; and therefore the resolving power at the present time is at its theoretical maximum. But in every theorem there are assumptions. Why must the field be axially symmetrical? Why must the field be stationary? Can’t we have pulsed electron beams in fields moving up along with the electrons? Must the field be symmetrical? I put this out as a challenge: Is there no way to make the electron microscope more powerful?

There may even be an economic point to this business of making things very small. Let me remind you of some of the problems of computing machines. In computers we have to store an enormous amount of information. The kind of writing that I was mentioning before, in which I had everything down as a distribution of metal, is permanent. Much more interesting to a computer is a way of writing, erasing, and writing something else. (This is usually because we don’t want to waste the material on which we have just written. Yet if we could write it in a very small space, it wouldn’t make any difference; it could just be thrown away after it was read. It doesn’t cost very much for the material).

If I look at your face I immediately recognize that I have seen it before. (Actually, my friends will say I have chosen an unfortunate example here for the subject of this illustration. At least I recognize that it is a man and not an apple.) Yet there is no machine which, with that speed, can take a picture of a face and say even that it is a man; and much less that it is the same man that you showed it before, unless it is exactly the same picture. If the face is changed; if I am closer to the face; if I am further from the face; if the light changes, I recognize it anyway. Now, this little computer I carry in my head is easily able to do that. The computers that we build are not able to do that. The number of elements in this bone box of mine is enormously greater than the number of elements in our “wonderful” computers. But our mechanical computers are too big; the elements in this box are microscopic. I want to make some that are sub-microscopic.

If we wanted to make a computer that had all these marvelous extra qualitative abilities, we would have to make it, perhaps, the size of the Pentagon. This has several disadvantages. First, it requires too much material; there may not be enough germanium in the world for all the transistors which would have to be put into this enormous thing. There is also the problem of heat generation and power consumption; the TVA would be needed to run the computer. But an even more practical difficulty is that the computer would be limited to a certain speed. Because of its large size, there is a finite time required to get the information from one place to another. The information cannot go any faster than the speed of light so, ultimately, when our computers get faster and faster and more and more elaborate, we will have to make them smaller and smaller.

But there is plenty of room to make them smaller. There is nothing that I can see in the physical laws that says the computer elements cannot be made enormously smaller than they are now. In fact, there may be certain advantages.

But I would like to discuss, just for amusement, that there are other possibilities. Why can’t we manufacture these small computers somewhat like we manufacture the big ones? Why can’t we drill holes, cut things, solder things, stamp things out, mold different shapes all at an infinitesimal level? What are the limitations as to how small a thing has to be before you can no longer mold it? How many times when you are working on something frustratingly tiny like your wife’s wrist watch, have you said to yourself, “If I could only train an ant to do this!” What I would like to suggest is the possibility of training an ant to train a mite to do this. What are the possibilities of small but movable machines? They may or may not be useful, but they surely would be fun to make.

Consider any machine (for example, an automobile) and ask about the problems of making an infinitesimal machine like it. Suppose, in the particular design of the automobile, we need a certain precision of the parts; we need an accuracy, let’s suppose, of 4/10,000 of an inch. If things are more inaccurate than that in the shape of the cylinder and so on, it isn’t going to work very well. If I make the thing too small, I have to worry about the size of the atoms; I can’t make a circle out of “balls,” so to speak, if the circle is too small. So, if I make the error, corresponding to 4/10,000 of an inch, correspond to an error of 10 atoms, it turns out that I can reduce the dimensions of an automobile approximately 4,000 times, so that it is 1 mm across. Obviously, if you redesign the car so that it would work with a much larger tolerance, which is not at all impossible, then you could make a much smaller device.
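The 4,000× figure follows directly. In the sketch below, the ~2.5 angstrom atomic spacing and the ~4 m car length are our own assumptions, supplied only to complete the arithmetic:

```python
# Shrink the car until a 4/10,000-inch tolerance corresponds to ~10 atoms.

tolerance_cm = 4e-4 * 2.54          # 4/10,000 inch, converted to cm
ten_atoms_cm = 10 * 2.5e-8          # assumed ~2.5 angstrom atomic spacing

scale = tolerance_cm / ten_atoms_cm # linear reduction factor
car_mm = 4_000 / scale              # assumed 4 m (4,000 mm) car, shrunk

print(f"reduction factor: {scale:.0f}")     # ~4,000
print(f"tiny car length: {car_mm:.2f} mm")  # ~1 mm, as Feynman says
```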

It is interesting to consider what the problems are in such small machines. Firstly, with parts stressed to the same degree, the forces go as the area you are reducing, so that things like weight and inertia are of relatively no importance. The strength of material, in other words, is very much greater in proportion. The stresses and expansion of the flywheel from centrifugal force, for example, would be the same proportion only if the rotational speed is increased in the same proportion as we decrease the size. On the other hand, the metals that we use have a grain structure, and this would be very annoying at small scale because the material is not homogeneous. Plastics and glass and things of this amorphous nature are very much more homogeneous, and so we would have to make our machines out of such materials.

There are problems associated with the electrical part of the system with the copper wires and the magnetic parts. The magnetic properties on a very small scale are not the same as on a large scale; there is the “domain” problem involved. A big magnet made of millions of domains can only be made on a small scale with one domain. The electrical equipment won’t simply be scaled down; it has to be redesigned. But I can see no reason why it can’t be redesigned to work again.

Heat escapes from a very small system extremely quickly; this rapid heat loss would prevent the gasoline from exploding, so an internal combustion engine is impossible. Other chemical reactions, liberating energy when cold, can be used. Probably an external supply of electrical power would be most convenient for such small machines.

What would be the utility of such machines? Who knows? Of course, a small automobile would only be useful for the mites to drive around in, and I suppose our Christian interests don’t go that far. However, we did note the possibility of the manufacture of small elements for computers in completely automatic factories, containing lathes and other machine tools at the very small level. The small lathe would not have to be exactly like our big lathe. I leave to your imagination the improvement of the design to take full advantage of the properties of things on a small scale, and in such a way that the fully automatic aspect would be easiest to manage.

A friend of mine (Albert R. Hibbs) suggests a very interesting possibility for relatively small machines. He says that, although it is a very wild idea, it would be interesting in surgery if you could swallow the surgeon. You put the mechanical surgeon inside the blood vessel and it goes into the heart and “looks” around. (Of course the information has to be fed out.) It finds out which valve is the faulty one and takes a little knife and slices it out. Other small machines might be permanently incorporated in the body to assist some inadequately-functioning organ.

Now comes the interesting question: How do we make such a tiny mechanism? I leave that to you. However, let me suggest one weird possibility. You know, in the atomic energy plants they have materials and machines that they can’t handle directly because they have become radioactive. To unscrew nuts and put on bolts and so on, they have a set of master and slave hands, so that by operating a set of levers here, you control the “hands” there, and can turn them this way and that so you can handle things quite nicely.

Most of these devices are actually made rather simply, in that there is a particular cable, like a marionette string, that goes directly from the controls to the “hands.” But, of course, things also have been made using servo motors, so that the connection between the one thing and the other is electrical rather than mechanical. When you turn the levers, they turn a servo motor, and it changes the electrical currents in the wires, which repositions a motor at the other end.

Now, I want to build much the same device, a master-slave system which operates electrically. But I want the slaves to be made especially carefully by modern large-scale machinists so that they are one-fourth the scale of the “hands” that you ordinarily maneuver. So you have a scheme by which you can do things at one-quarter scale anyway; the little servo motors with little hands play with little nuts and bolts; they drill little holes; they are four times smaller. Aha! So I manufacture a quarter-size lathe; I manufacture quarter-size tools; and I make, at the one-quarter scale, still another set of hands, again relatively one-quarter size! This is one-sixteenth size, from my point of view. And after I finish doing this I wire directly from my large-scale system, through transformers perhaps, to the one-sixteenth-size servo motors. Thus I can now manipulate the one-sixteenth size hands.
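The geometry of the cascade is simple enough to tabulate; a tiny illustration of the quarter-scale-per-stage progression:

```python
# Each master-slave generation is one-quarter the scale of the previous one.
for stage in range(1, 5):
    print(f"stage {stage}: 1/{4 ** stage} of full size")
# stage 1: 1/4, stage 2: 1/16 (the "one-sixteenth size hands"),
# stage 3: 1/64, stage 4: 1/256, and so on.
```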

Well, you get the principle from there on. It is rather a difficult program, but it is a possibility. You might say that one can go much farther in one step than from one to four. Of course, this has all to be designed very carefully and it is not necessary simply to make it like hands. If you thought of it very carefully, you could probably arrive at a much better system for doing such things.

If you work through a pantograph, even today, you can get much more than a factor of four in even one step. But you can’t work directly through a pantograph which makes a smaller pantograph which then makes a smaller pantograph because of the looseness of the holes and the irregularities of construction. The end of the pantograph wiggles with a relatively greater irregularity than the irregularity with which you move your hands. In going down this scale, I would find the end of the pantograph on the end of the pantograph on the end of the pantograph shaking so badly that it wasn’t doing anything sensible at all.

At each stage, it is necessary to improve the precision of the apparatus. If, for instance, having made a small lathe with a pantograph, we find its lead screw irregular (more irregular than the large-scale one) we could lap the lead screw against breakable nuts that you can reverse in the usual way back and forth until this lead screw is, at its scale, as accurate as our original lead screws, at our scale.

We can make flats by rubbing unflat surfaces in triplicates together, in three pairs, and the flats then become flatter than the thing you started with. Thus, it is not impossible to improve precision on a small scale by the correct operations. So, when we build this stuff, it is necessary at each step to improve the accuracy of the equipment by working for a while down there, making accurate lead screws, Johansson blocks, and all the other materials which we use in accurate machine work at the higher level. We have to stop at each level and manufacture all the stuff to go to the next level, a very long and very difficult program. Perhaps you can figure a better way than that to get down to small scale more rapidly.

Yet, after all this, you have just got one little baby lathe four thousand times smaller than usual. But we were thinking of making an enormous computer, which we were going to build by drilling holes on this lathe to make little washers for the computer. How many washers can you manufacture on this one lathe?

Where am I going to put the million lathes that I am going to have? Why, there is nothing to it; the volume is much less than that of even one full-scale lathe. For instance, if I made a billion little lathes, each 1/4000 of the scale of a regular lathe, there are plenty of materials and space available because in the billion little ones there is less than 2 percent of the materials in one big lathe.
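The “less than 2 percent” claim is just cubic scaling of volume; a one-line check (our own illustration):

```python
# Material in a billion 1/4000-scale lathes vs. one full-size lathe.
# Volume (and hence material) scales as the cube of the linear scale.

n = 1_000_000_000
scale = 1 / 4_000
fraction = n * scale ** 3

print(f"fraction of one big lathe: {fraction:.4f}")  # ~0.016, under 2 percent
```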

It doesn’t cost anything for materials, you see. So I want to build a billion tiny factories, models of each other, which are manufacturing simultaneously, drilling holes, stamping parts, and so on.

As we go down in size, there are a number of interesting problems that arise. All things do not simply scale down in proportion. There is the problem that materials stick together by the molecular (Van der Waals) attractions. It would be like this: After you have made a part and you unscrew the nut from a bolt, it isn’t going to fall down because the gravity isn’t appreciable; it would even be hard to get it off the bolt. It would be like those old movies of a man with his hands full of molasses, trying to get rid of a glass of water. There will be several problems of this nature that we will have to be ready to design for.

Up to now, we have been content to dig in the ground to find minerals. We heat them and we do things on a large scale with them, and we hope to get a pure substance with just so much impurity, and so on. But we must always accept some atomic arrangement that nature gives us. We haven’t got anything, say, with a “checkerboard” arrangement, with the impurity atoms exactly arranged 1,000 angstroms apart, or in some other particular pattern.

What could we do with layered structures with just the right layers? What would the properties of materials be if we could really arrange the atoms the way we want them? They would be very interesting to investigate theoretically. I can’t see exactly what would happen, but I can hardly doubt that when we have some control of the arrangement of things on a small scale we will get an enormously greater range of possible properties that substances can have, and of different things that we can do.

Consider, for example, a piece of material in which we make little coils and condensers (or their solid state analogs) 1,000 or 10,000 angstroms in a circuit, one right next to the other, over a large area, with little antennas sticking out at the other end, a whole series of circuits. Is it possible, for example, to emit light from a whole set of antennas, like we emit radio waves from an organized set of antennas to beam the radio programs to Europe? The same thing would be to beam the light out in a definite direction with very high intensity. (Perhaps such a beam is not very useful technically or economically.)

I have thought about some of the problems of building electric circuits on a small scale, and the problem of resistance is serious. If you build a corresponding circuit on a small scale, its natural frequency goes up, since the wave length goes down as the scale; but the skin depth only decreases with the square root of the scale ratio, and so resistive problems are of increasing difficulty. Possibly we can beat resistance through the use of superconductivity if the frequency is not too high, or by other tricks.
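The scaling argument can be made concrete. The sketch below is our own paraphrase of the reasoning (resonant frequency goes as 1/scale, skin depth as 1/sqrt(frequency), resistance as length over conducting cross-section), in arbitrary units:

```python
# Why resistance worsens as circuits shrink: frequency ~ 1/s,
# skin depth ~ sqrt(s), so R ~ s / (s * sqrt(s)) = 1/sqrt(s).
# Pure scaling argument; the absolute values are arbitrary units.

import math

for s in (1.0, 1e-2, 1e-4):
    f = 1 / s                     # resonant frequency ~ 1/scale
    delta = 1 / math.sqrt(f)      # skin depth ~ 1/sqrt(frequency)
    resistance = s / (s * delta)  # length / (perimeter * skin depth)
    print(f"scale {s:g}: relative resistance {resistance:g}")
# Output: 1, 10, 100 -- a 10,000x reduction costs ~100x in resistance.
```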

Another thing we will notice is that, if we go down far enough, all of our devices can be mass produced so that they are absolutely perfect copies of one another. We cannot build two large machines so that the dimensions are exactly the same. But if your machine is only 100 atoms high, you only have to get it correct to one-half of one percent to make sure the other machine is exactly the same size, namely, 100 atoms high!

At the atomic level, we have new kinds of forces and new kinds of possibilities, new kinds of effects. The problems of manufacture and reproduction of materials will be quite different. I am, as I said, inspired by the biological phenomena in which chemical forces are used in a repetitious fashion to produce all kinds of weird effects (one of which is the author).

The principles of physics, as far as I can see, do not speak against the possibility of maneuvering things atom by atom. It is not an attempt to violate any laws; it is something, in principle, that can be done; but in practice, it has not been done because we are too big.

Ultimately, we can do chemical synthesis. A chemist comes to us and says, “Look, I want a molecule that has the atoms arranged thus and so; make me that molecule.” The chemist does a mysterious thing when he wants to make a molecule. He sees that it has got that ring, so he mixes this and that, and he shakes it, and he fiddles around. And, at the end of a difficult process, he usually does succeed in synthesizing what he wants. By the time I get my devices working, so that we can do it by physics, he will have figured out how to synthesize absolutely anything, so that this will really be useless.

But it is interesting that it would be, in principle, possible (I think) for a physicist to synthesize any chemical substance that the chemist writes down. Give the orders and the physicist synthesizes it. How? Put the atoms down where the chemist says, and so you make the substance. The problems of chemistry and biology can be greatly helped if our ability to see what we are doing, and to do things on an atomic level, is ultimately developed, a development which I think cannot be avoided.

Now, you might say, “Who should do this and why should they do it?” Well, I pointed out a few of the economic applications, but I know that the reason that you would do it might be just for fun. But have some fun! Let’s have a competition between laboratories. Let one laboratory make a tiny motor which it sends to another lab which sends it back with a thing that fits inside the shaft of the first motor.

Perhaps this doesn’t excite you to do it, and only economics will do so. Then I want to do something; but I can’t do it at the present moment, because I haven’t prepared the ground. It is my intention to offer a prize of $1,000 to the first guy who can take the information on the page of a book and put it on an area 1/25,000 smaller in linear scale in such manner that it can be read by an electron microscope.

And I want to offer another prize (if I can figure out how to phrase it so that I don’t get into a mess of arguments about definitions) of another $1,000 to the first guy who makes an operating electric motor, a rotating electric motor which can be controlled from the outside and, not counting the lead-in wires, is only 1/64 inch cube.

I do not expect that such prizes will have to wait very long for claimants.



Automation: The Car Company Tycoon Game Windows – Mod DB

Posted: July 3, 2016 at 12:14 pm

This one has been a little slower and more complex to develop than expected, but after a long 3 months (4 really, but one was taken up with our first holiday in 4 years!) B1414 is now live for everyone. This update brings more complete car design aspects, along with car designer scenarios and a much improved user interface. There are also quite a few new car bodies to base your designs on.

The next few months will be dedicated to some fairly unexciting but very important work, ready for release on Steam. We’ll be making major improvements aimed at making Automation more polished, easier to learn, and all-around more professional looking. More improvements to the UI and the process of model designing will make it much more logical and simple to design a big range of models based on one particular car. Pop-ups/tooltips will be added for many things, and tutorial videos will be done/redone covering every technical part of the game. We’ll also be aiming to add a bunch of improved multiplayer modes, including Lap Time and Rally Stage Time challenge modes, with the ability to set your scores over a few days, so you don’t all need to be online together to compete. This update will be the first one to release on Steam, which is an exciting milestone, and will hopefully bring in more sales, allowing us to bring further people on to work on car body art and other new content. And after this update is out of the way, it’s finally time to start work on the Tycoon aspect of things!

Car Designer Features & Changes

- Added Mid Engine Cars
- Suspension Easy Mode
- Added Quality Sliders and dependencies for all car designer tabs
- Adjustable Rim Offsets
- Added Multilink Suspension
- Added the first automatic gearboxes
- Reliability and Environmental Resistance stats
- Passenger Space and Cargo Space stats
- Production Units, Costs and Service Costs stats
- Offroad and Utility stats
- Rebalanced Sportiness, Tameness, Comfort, Prestige and Safety calculations
- 9 Car Designer Scenarios
- Rebalanced material properties
- Many new part year dependencies
- Limited cars to a maximum of 2 wings and 2 lips
- Added tire profile year limitation
- Base safety will stop progressing 10 years after a body first unlocks
- Bodies sorted by year, newest at the top
- Revised the Bottoming Out calculations to be less harsh
- Wings/Lips no longer punch holes in the body shell

Car Designer Fixes

- Fixed the crash caused by using MPH + dragging the top speed slider to the top for high-revving engines (finally!)
- Fixed the Yaw Rate graph cut off when using mph as a unit for speed
- Fixed certain cars not being able to complete a lap
- Fixed the proper gear delay being used on the test track
- Fixed Front Longitudinal AWD engine placement issues
- Fixed the sensitivity of resizing various fixtures, making it more responsive
- Fixed steamroller bug where wheels would become comically wide

New Car Bodies

- Large 60s Coupe
- 2 Large 70s Coupes
- Large 90s Coupe
- Large 60s Sedan
- Large 00s Sedan
- Small 80s Supercar
- Small 10s Supercar
- Large 10s Supercar

UI & Sound

- Completely reworked UI and UI flow
- Car Design Wizard for the whole car design process
- All new UI sounds
- Ambient sounds
- Added test track sounds

New Car & Engine Manager

- Temporarily changed the Platform/Model game mechanic
- Many more stats on the three different testing pages
- Updated graphs
- Updated test track UI
- Manual start for car testing on the testing page

Engine Designer Fixes / Rebalances

- Reduced power gain when enriching the fuel mixture
- Octane rating in VVL systems uses the lower cam setting
- Added bypass valve year limitation
- Fixed the “your engine was created in a previous version” message bug
- Fixed bore and stroke having two decimals too few when using imperial units
- Fixed a bug where loading a VVL engine set the wrong lower cam setting
- Fixed an engine loading bug that caused the block config Lua error

General Things

- Changed MTBF to Reliability for less confusion
- New scenario scoring system implemented for car designer scenarios
- Fixed various aerodynamics calculations and exploits
- Changed all Man Hours to Production Units
- Added a console, accessed by pressing tilde (~). Commands are help(), HideBuildings(), ShowBuildings().
- Changed to saving screenshots as PNG. If you turn off FXAA and use the HideBuildings() command, you can take pictures of engines/cars on a transparent backdrop, which is useful for screenshots with no background.
- Fixed the tutorial video sound cutting off after a minute
- Thumbnails are now deleted when you delete the model/engine they belong to
- Many more little fixes



Biological warfare – Wikipedia, the free encyclopedia

Posted: June 12, 2016 at 8:25 pm

Biological warfare (BW), also known as germ warfare, is the use of biological toxins or infectious agents such as bacteria, viruses, and fungi with the intent to kill or incapacitate humans, animals or plants as an act of war. Biological weapons (often termed “bio-weapons”, “biological threat agents”, or “bio-agents”) are living organisms or replicating entities (viruses, which are not universally considered “alive”) that reproduce or replicate within their host victims. Entomological (insect) warfare is also considered a type of biological weapon. This type of warfare is distinct from nuclear warfare and chemical warfare, which together with biological warfare make up NBC, the military acronym for nuclear, biological, and chemical warfare using weapons of mass destruction (WMDs). None of these are conventional weapons, which are effective primarily due to their explosive, kinetic, or incendiary potential.

Biological weapons may be employed in various ways to gain a strategic or tactical advantage over the enemy, either by threats or by actual deployments. Like some of the chemical weapons, biological weapons may also be useful as area denial weapons. These agents may be lethal or non-lethal, and may be targeted against a single individual, a group of people, or even an entire population. They may be developed, acquired, stockpiled or deployed by nation states or by non-national groups. In the latter case, or if a nation-state uses it clandestinely, it may also be considered bioterrorism.[1]

There is an overlap between biological warfare and chemical warfare, as the use of toxins produced by living organisms is considered under the provisions of both the Biological Weapons Convention and the Chemical Weapons Convention. Toxins and psychochemical weapons are often referred to as midspectrum agents. Unlike bioweapons, these midspectrum agents do not reproduce in their host and are typically characterized by shorter incubation periods.[2]

Offensive biological warfare, including mass production, stockpiling and use of biological weapons, was outlawed by the 1972 Biological Weapons Convention (BWC). The rationale behind this treaty, which has been ratified or acceded to by 170 countries as of April 2013,[3] is to prevent a biological attack which could conceivably result in large numbers of civilian casualties and cause severe disruption to economic and societal infrastructure.[citation needed] Many countries, including signatories of the BWC, currently pursue research into the defense or protection against BW, which is not prohibited by the BWC.

A nation or group that can pose a credible threat of mass casualty has the ability to alter the terms on which other nations or groups interact with it. Biological weapons allow for the potential to create a level of destruction and loss of life far in excess of nuclear, chemical or conventional weapons, relative to their mass and cost of development and storage. Therefore, biological agents may be useful as strategic deterrents in addition to their utility as offensive weapons on the battlefield.[4][5]

As a tactical weapon for military use, a significant problem with a BW attack is that it would take days to be effective, and therefore might not immediately stop an opposing force. Some biological agents (smallpox, pneumonic plague) have the capability of person-to-person transmission via aerosolized respiratory droplets. This feature can be undesirable, as the agent(s) may be transmitted by this mechanism to unintended populations, including neutral or even friendly forces. While containment of BW is less of a concern for certain criminal or terrorist organizations, it remains a significant concern for the military and civilian populations of virtually all nations.

Rudimentary forms of biological warfare have been practiced since antiquity.[6] During the 6th century BC, the Assyrians poisoned enemy wells with a fungus that would render the enemy delirious. In 1346, the bodies of Mongol warriors of the Golden Horde who had died of plague were thrown over the walls of the besieged Crimean city of Kaffa. Specialists disagree over whether this operation may have been responsible for the spread of the Black Death into Europe.[7][8][9][10]

It has been claimed that the British Marines used smallpox in New South Wales in 1789.[11] Historians have long debated inconclusively whether the British Army used smallpox in an episode against Native Americans in 1763.[12]

By 1900 the germ theory and advances in bacteriology brought a new level of sophistication to the techniques for possible use of bio-agents in war. Biological sabotage, in the form of anthrax and glanders, was undertaken on behalf of the Imperial German government during World War I (1914–1918), with indifferent results.[13] The Geneva Protocol of 1925 prohibited the use of chemical weapons and biological weapons.

With the onset of World War II, the Ministry of Supply in the United Kingdom established a BW program at Porton Down, headed by the microbiologist Paul Fildes. The research was championed by Winston Churchill and soon tularemia, anthrax, brucellosis, and botulism toxins had been effectively weaponized. In particular, Gruinard Island in Scotland was contaminated with anthrax during a series of extensive tests and remained so for the next 56 years. Although the UK never offensively used the biological weapons it developed on its own, its program was the first to successfully weaponize a variety of deadly pathogens and bring them into industrial production.[14]

When the USA entered the war, mounting British pressure for the creation of a similar research program and an Allied pooling of resources led to the creation of a large industrial complex at Fort Detrick, Maryland in 1942 under the direction of George W. Merck.[15] The biological and chemical weapons developed during that period were tested at the Dugway Proving Grounds in Utah. Soon there were facilities for the mass production of anthrax spores, brucellosis, and botulism toxins, although the war was over before these weapons could be of much operational use.[16]

The most notorious program of the period was run by the secret Imperial Japanese Army Unit 731 during the war, based at Pingfan in Manchuria and commanded by Lieutenant General Shirō Ishii. This unit did research on BW, conducted often fatal human experiments on prisoners, and produced biological weapons for combat use.[17] Although the Japanese effort lacked the technological sophistication of the American or British programs, it far outstripped them in its widespread application and indiscriminate brutality. Biological weapons were used against both Chinese soldiers and civilians in several military campaigns.[18] In 1940, the Japanese Army Air Force bombed Ningbo with ceramic bombs full of fleas carrying the bubonic plague.[19] Many of these operations were ineffective due to inefficient delivery systems,[17] although up to 400,000 people may have died.[20] During the Zhejiang-Jiangxi Campaign in 1942, around 1,700 Japanese troops died out of a total of 10,000 Japanese soldiers who fell ill with disease when their own biological weapons attack rebounded on their own forces.[21][22]

During the final months of World War II, Japan planned to use plague as a biological weapon against U.S. civilians in San Diego, California, during Operation Cherry Blossoms at Night. The plan was set to launch on 22 September 1945, but it was not executed because of Japan’s surrender on 15 August 1945.[23][24][25][26]

In Britain, the 1950s saw the weaponization of plague, brucellosis, tularemia and later equine encephalomyelitis and vaccinia viruses, but the programme was unilaterally cancelled in 1956. The United States Army Biological Warfare Laboratories weaponized anthrax, tularemia, brucellosis, Q-fever and others.

In 1969, the UK and the Warsaw Pact, separately, introduced proposals to the UN to ban biological weapons, and US President Richard Nixon terminated production of biological weapons, allowing only scientific research for defensive measures. The Biological and Toxin Weapons Convention was signed by the US, UK, USSR and other nations, as a ban on “development, production and stockpiling of microbes or their poisonous products except in amounts necessary for protective and peaceful research” in 1972. However, the Soviet Union continued research and production of massive offensive biological weapons in a program called Biopreparat, despite having signed the convention.[27] By 2011, 165 countries had signed the treaty and none are proven, though nine are still suspected,[28] to possess offensive BW programs.[28]

It has been argued that rational state actors would never use biological weapons offensively. The argument is that biological weapons cannot be controlled: the weapon could backfire and harm the army on the offensive, perhaps having even worse effects than on the target. An agent like smallpox or other airborne viruses would almost certainly spread worldwide and ultimately infect the user’s home country. However, this argument does not necessarily apply to bacteria. For example, anthrax can easily be controlled and even created in a garden shed; the FBI suspects it can be done for as little as $2,500 using readily available laboratory equipment.[29] Also, using microbial methods, bacteria can be suitably modified to be effective in only a narrow environmental range, the range of the target, which distinctly differs from that of the army on the offensive. Thus only the target might be affected adversely. The weapon may be further used to bog down an advancing army, making it more vulnerable to counterattack by the defending force.

Ideal characteristics of a biological agent to be used as a weapon against humans are high infectivity, high virulence, non-availability of vaccines, and availability of an effective and efficient delivery system. Stability of the weaponized agent (ability of the agent to retain its infectivity and virulence after a prolonged period of storage) may also be desirable, particularly for military applications, and the ease of creating one is often considered. Control of the spread of the agent may be another desired characteristic.

The primary difficulty is not the production of the biological agent, as many biological agents used in weapons can often be manufactured relatively quickly, cheaply and easily. Rather, it is the weaponization, storage and delivery in an effective vehicle to a vulnerable target that pose significant problems.

For example, Bacillus anthracis is considered an effective agent for several reasons. First, it forms hardy spores, perfect for dispersal as aerosols. Second, this organism is not considered transmissible from person to person, and thus rarely if ever causes secondary infections. A pulmonary anthrax infection starts with ordinary influenza-like symptoms and progresses to a lethal hemorrhagic mediastinitis within 3–7 days, with a fatality rate that is 90% or higher in untreated patients.[30] Finally, friendly personnel can be protected with suitable antibiotics.

A large-scale attack using anthrax would require the creation of aerosol particles of 1.5 to 5 µm: larger particles would not reach the lower respiratory tract, while smaller particles would be exhaled back out into the atmosphere. At this size, conductive powders tend to aggregate because of electrostatic charges, hindering dispersion. So the material must be treated to insulate and neutralize the charges. The weaponized agent must be resistant to degradation by rain and ultraviolet radiation from sunlight, while retaining the ability to efficiently infect the human lung. There are other technological difficulties as well, chiefly relating to storage of the weaponized agent.

Agents considered for weaponization, or known to be weaponized, include bacteria such as Bacillus anthracis, Brucella spp., Burkholderia mallei, Burkholderia pseudomallei, Chlamydophila psittaci, Coxiella burnetii, Francisella tularensis, some of the Rickettsiaceae (especially Rickettsia prowazekii and Rickettsia rickettsii), Shigella spp., Vibrio cholerae, and Yersinia pestis. Many viral agents have been studied and/or weaponized, including some of the Bunyaviridae (especially Rift Valley fever virus), Ebolavirus, many of the Flaviviridae (especially Japanese encephalitis virus), Machupo virus, Marburg virus, Variola virus, and Yellow fever virus. Fungal agents that have been studied include Coccidioides spp.[31][32]

Toxins that can be used as weapons include ricin, staphylococcal enterotoxin B, botulinum toxin, saxitoxin, and many mycotoxins. These toxins and the organisms that produce them are sometimes referred to as select agents. In the United States, their possession, use, and transfer are regulated by the Centers for Disease Control and Prevention’s Select Agent Program.

The former US biological warfare program categorized its weaponized anti-personnel bio-agents as either Lethal Agents (Bacillus anthracis, Francisella tularensis, Botulinum toxin) or Incapacitating Agents (Brucella suis, Coxiella burnetii, Venezuelan equine encephalitis virus, Staphylococcal enterotoxin B).

The United States developed an anti-crop capability during the Cold War that used plant diseases (bioherbicides, or mycoherbicides) for destroying enemy agriculture. Biological weapons also target fisheries as well as water-based vegetation. It was believed that destruction of enemy agriculture on a strategic scale could thwart Sino-Soviet aggression in a general war. Diseases such as wheat blast and rice blast were weaponized in aerial spray tanks and cluster bombs for delivery to enemy watersheds in agricultural regions to initiate epiphytotics (epidemics among plants). When the United States renounced its offensive biological warfare program in 1969 and 1970, the vast majority of its biological arsenal was composed of these plant diseases.[citation needed] Enterotoxins and Mycotoxins were not affected by Nixon’s order.

Though herbicides are chemicals, they are often grouped with biological warfare and chemical warfare because they may work in a similar manner as biotoxins or bioregulators. The Army Biological Laboratory tested each agent and the Army’s Technical Escort Unit was responsible for transport of all chemical, biological, radiological (nuclear) materials. Scorched earth tactics or destroying livestock and farmland were carried out in the Vietnam war (cf. Agent Orange)[33] and Eelam War in Sri Lanka.[citation needed]

Biological warfare can also specifically target plants to destroy crops or defoliate vegetation. The United States and Britain discovered plant growth regulators (i.e., herbicides) during the Second World War, and initiated a herbicidal warfare program that was eventually used in Malaya and Vietnam in counterinsurgency operations.

In the 1980s, the Soviet Ministry of Agriculture successfully developed variants of foot-and-mouth disease and rinderpest against cows, African swine fever for pigs, and psittacosis to kill chickens. These agents were prepared to be sprayed from tanks attached to airplanes over hundreds of miles. The secret program was code-named “Ecology”.[31]

Attacking animals is another area of biological warfare intended to eliminate animal resources for transportation and food. In the First World War, German agents were arrested attempting to inoculate draft animals with anthrax, and they were believed to be responsible for outbreaks of glanders in horses and mules. The British tainted small feed cakes with anthrax in the Second World War as a potential means of attacking German cattle for food denial, but never employed the weapon. In the 1950s, the United States had a field trial with hog cholera.[citation needed] During the Mau Mau Uprising in 1952, the poisonous latex of the African milk bush was used to kill cattle.[34]

Outside the context of war, humans have deliberately introduced the rabbit disease myxomatosis, originating in South America, to Australia and Europe, with the intention of reducing the rabbit population. This had devastating but temporary results: wild rabbit populations were reduced to a fraction of their former size, but the survivors developed immunity and increased again.

Entomological warfare (EW) is a type of biological warfare that uses insects to attack the enemy. The concept has existed for centuries and research and development have continued into the modern era. EW has been used in battle by Japan, and several other nations have developed, and been accused of using, entomological warfare programs. EW may employ insects in a direct attack or as vectors to deliver a biological agent, such as plague. Essentially, EW exists in three varieties. One type of EW involves infecting insects with a pathogen and then dispersing the insects over target areas.[35] The insects then act as a vector, infecting any person or animal they might bite. Another type of EW is a direct insect attack against crops; the insect may not be infected with any pathogen but instead represents a threat to agriculture. The final method uses uninfected insects, such as bees, wasps, etc., to directly attack the enemy.[36]

In 2010, at the Meeting of the States Parties to the Convention on the Prohibition of the Development, Production and Stockpiling of Bacteriological (Biological) and Toxin Weapons and Their Destruction in Geneva,[37] sanitary epidemiological reconnaissance was suggested as a well-tested means of enhancing the monitoring of infections and parasitic agents, for practical implementation of the International Health Regulations (2005). The aim was to prevent and minimize the consequences of natural outbreaks of dangerous infectious diseases as well as the threat of alleged use of biological weapons against BTWC States Parties.

It is important to note that most classical and modern biological weapons’ pathogens can be obtained from a plant or an animal which is naturally infected.[38]

Indeed, in the largest known biological weapons accident, the anthrax outbreak in Sverdlovsk (now Yekaterinburg) in the Soviet Union in 1979, sheep became ill with anthrax as far as 200 kilometers from the release point of the organism, a military facility in the southeastern portion of the city (still off limits to visitors today; see Sverdlovsk anthrax leak).[39]

Thus, a robust surveillance system involving human clinicians and veterinarians may identify a bioweapons attack early in the course of an epidemic, permitting the prophylaxis of disease in the vast majority of people (and/or animals) exposed but not yet ill.

For example, in the case of anthrax, it is likely that by 24–36 hours after an attack, some small percentage of individuals (those with a compromised immune system or who had received a large dose of the organism due to proximity to the release point) will become ill with classical symptoms and signs (including a virtually unique chest X-ray finding, often recognized by public health officials if they receive timely reports).[40] The incubation period for humans is estimated to be about 11.8 days to 12.1 days. This suggested period is the first model that is independently consistent with data from the largest known human outbreak. These projections refine previous estimates of the distribution of early-onset cases after a release and support a recommended 60-day course of prophylactic antibiotic treatment for individuals exposed to low doses of anthrax.[41] By making these data available to local public health officials in real time, most models of anthrax epidemics indicate that more than 80% of an exposed population can receive antibiotic treatment before becoming symptomatic, and thus avoid the moderately high mortality of the disease.[40]
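To make the surveillance argument concrete, here is a deliberately crude Monte Carlo illustration. It is not the cited model: the lognormal incubation distribution, its dispersion, and the assumed four-day detection-and-response lag are all our own placeholder assumptions, chosen only to show how a long median incubation leaves a wide treatment window:

```python
# Toy simulation (our simplification, not the published model): if incubation
# times are lognormal with a ~12-day median, what fraction of the exposed is
# still asymptomatic when prophylaxis arrives a few days after detection?

import math
import random

random.seed(0)
median_days = 12.0   # median incubation, roughly as stated above
sigma = 0.5          # assumed dispersion, for illustration only
treatment_day = 4.0  # assumed detection + response lag, in days

n = 100_000
incubation = [random.lognormvariate(math.log(median_days), sigma)
              for _ in range(n)]
treated_in_time = sum(t > treatment_day for t in incubation) / n
print(f"treated before symptoms: {treated_in_time:.0%}")  # well above 80%
```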

From most specific to least specific:[42]

1. A single case of disease caused by an uncommon agent, without an epidemiological explanation.

2. Unusual, rare, genetically engineered strain of an agent.

3. High morbidity and mortality rates among patients with the same or similar symptoms.

4. Unusual presentation of the disease.

5. Unusual geographic or seasonal distribution.

6. Stable endemic disease, but with an unexplained increase in incidence.

7. Transmission through unusual routes (aerosols, food, water).

8. No illness in people who were not exposed to common ventilation systems, while illness is seen in persons in close proximity who share a ventilation system.

9. Different and unexplained diseases coexisting in the same patient without any other explanation.

10. Rare illness that affects a large, disparate population (respiratory disease might suggest the pathogen or agent was inhaled).

11. Illness that is unusual for the population or age group in which it presents.

12. Unusual trends of death and/or illness in animal populations, preceding or accompanying illness in humans.

13. Many affected people seeking treatment at the same time.

14. Similar genetic makeup of agents in affected individuals.

15. Simultaneous clusters of similar illness in non-contiguous areas, domestic or foreign.

16. An abundance of cases of unexplained diseases and deaths.

The goal of biodefense is to integrate the sustained efforts of the national and homeland security, medical, public health, intelligence, diplomatic, and law enforcement communities. Health care providers and public health officers are among the first lines of defense. In some countries private, local, and provincial (state) capabilities are being augmented by and coordinated with federal assets, to provide layered defenses against biological weapon attacks. During the first Gulf War the United Nations activated a biological and chemical response team, Task Force Scorpio, to respond to any potential use of weapons of mass destruction on civilians.

The traditional approach toward protecting agriculture, food, and water, which focuses on the natural or unintentional introduction of a disease, is being strengthened by focused efforts to address current and anticipated future biological weapons threats that may be deliberate, multiple, and repetitive.

The growing threat of biowarfare agents and bioterrorism has led to the development of specific field tools that perform on-the-spot analysis and identification of encountered suspect materials. One such technology, being developed by researchers from the Lawrence Livermore National Laboratory (LLNL), employs a “sandwich immunoassay”, in which fluorescent dye-labeled antibodies aimed at specific pathogens are attached to silver and gold nanowires.[43]

In the Netherlands, the company TNO has designed Bioaerosol Single Particle Recognition eQuipment (BiosparQ). This system would be implemented into the national response plan for bioweapon attacks in the Netherlands.[44]

Researchers at Ben Gurion University in Israel are developing a different device called the BioPen, essentially a “Lab-in-a-Pen”, which can detect known biological agents in under 20 minutes using an adaptation of ELISA, a widely employed immunological technique, that in this case incorporates fiber optics.[45]

Theoretically, novel approaches in biotechnology, such as synthetic biology, could be used in the future to design novel types of biological warfare agents.[46][47][48][49] Special attention must therefore be paid to future “experiments of concern”.[50]

Most of the biosecurity concerns in synthetic biology, however, are focused on the role of DNA synthesis and the risk of producing genetic material of lethal viruses (e.g., the 1918 Spanish flu virus, poliovirus) in the lab.[51][52][53] Recently, the CRISPR/Cas system has emerged as a promising technique for gene editing. It was hailed by The Washington Post as “the most important innovation in the synthetic biology space in nearly 30 years.”[54] While other methods take months or years to edit gene sequences, CRISPR cuts that time down to weeks.[54] However, due to its ease of use and accessibility, it has raised a number of ethical concerns, especially surrounding its use in the biohacking space.[54][55][56]

(passim)

Read this article:

Biological warfare – Wikipedia, the free encyclopedia

Posted in Germ Warfare | Comments Off on Biological warfare – Wikipedia, the free encyclopedia

Bitcoin Magazine | Bitcoin and Cryptocurrency News

Posted: October 4, 2015 at 4:43 pm

Dutch National Police Set Sights on Blockchain-Based Cloud Services
How the Hunt for Satoshi Turned Dorian Nakamoto’s Life Upside Down: the Inside Story
Bitcoin Used to Pay Utility and Credit Card Bills in the Philippines and Australia
Building a Risk Market for the Digital Age Using Bitcoin
WEF Survey Predicts Bitcoin ‘Tipping Point’ Happening By 2027
Australian Startups Close Down as Banks End Support for Bitcoin
Beyond Bitcoin: How the Blockchain Can Power a New Generation of Enterprise Software
R3 Blockchain Development Initiative Grows to 22 Banks Worldwide
The Decentralist Perspective, or Why Bitcoin Might Need Small Blocks
Everything You Need to Know about the Proposed Changes to the Bitcoin Block Size Cap
Gavin Andresen on the Block Size: It’s Hard to Find a Signal above All the Noise
Bitcoin XT Big Block Fork Spurs Debate and Controversy
Transaction Remote Release, a Tor-Inspired Proposed Change to Bitcoin Protocol for Anonymous Transactions
Vaultoro Seeks to Provide a Store of Value to the Underbanked World Using Gold and Bitcoin
Bank of America Files Patent Application for Cryptocurrency-Mediated Wire Transfers
Bitcoin Tracker One ETN Offers Liquidity to European Investors
NeuCoin Launches a New Digital Currency for Online and Mobile Gaming
Digital Currency Derivatives Exchanges Prepare for Regulation after CFTC Bitcoin Ruling
Nine Top Global Banks Pool Resources to Fund R3 to Develop Digital Currency Standards
UBS to Develop Yet Another Permissioned Blockchain for Banks
Blythe Masters and Wall Street Opt for Permissioned Non-Bitcoin Blockchains
Nick Szabo on Permissioned Blockchains and the Block Size
Bitcoin Derivatives Company LedgerX Appoints Ex-CFTC Commissioner Wetjen to Board
Bitcoin Hardware Wallet KeepKey Launches and Begins Shipping
Three Bitcoin Finalists Vie for BBVA Open Talent Competition Honors in Barcelona
Australian Regulators Investigating Banks for Closing Accounts of Bitcoin Companies
Bank of England Chief Economist: Blockchain-based Digital Currency Issued by Central Banks Could Replace Cash
Sig3 Launches an Automated Policy-Based Transaction Cosigner for Multisig Bitcoin Transactions
BitGo Processes Over $1 Billion in Bitcoin Transactions in Third Quarter
Storj Network Passes 1 Petabyte Storage Space
Bitcoin and Gold Exchange Vaultoro Reaches $1 Million in Gold Trading Volume

Read more from the original source:
Bitcoin Magazine | Bitcoin and Cryptocurrency News

Posted in Bitcoin | Comments Off on Bitcoin Magazine | Bitcoin and Cryptocurrency News

Jitsi, ostel.co and ISP censorship | The Guardian Project

Posted: August 30, 2015 at 7:46 pm

Early last week, n8fr8 suspected something had changed on the ostel.co server, because many users were emailing support specifically about Jitsi connectivity to ostel.co. The common question: why did it work a few weeks ago, and why doesn’t it anymore?

The tl;dr follows; skip to the keyword CONCLUSION to get just the punch line.

In support of n8fr8’s hypothesis, there had been a small change to the server, but I wasn’t convinced it affected anything, since all my clients, including Jitsi, continued to work properly. Obviously something had changed, but none of us knew what it was. After some testing we discovered the problem was related to insecure connections from Jitsi to UDP port 5060 on ostel.co. Secure connections (on TCP port 5061) continued to work as expected.

To make matters more confusing, I could register and make calls with two different clients (CSipSimple and Linphone) on the same network (my home ISP, Verizon FiOS) using an insecure connection to ostel.co on UDP port 5060.

At this point I was like WTF?

I went back to the server, diffed all the configs, checked server versions, connected with every client I could find that would run on any of my computers. The only change was a Kamailio upgrade from 4.0.1 to 4.0.2. A minor point release. The problem with Jitsi remained. What could the server be doing to this poor client?

I did a packet trace on the ostel.co server’s public network interface, filtered to dump only the packets on UDP port 5060 that match my SIP username. I opened Jitsi and things got interesting. For the curious, here’s the utility and the options I used. If you are new to operating a SIP network, ngrep is an excellent tool for debugging.

ngrep -d eth0 -t -p -W byline foo port 5060
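
For readers new to ngrep, here is what those options do (annotations added; "foo" stands in for the SIP username being matched):

# -d eth0     capture on the eth0 network interface
# -t          print a timestamp for each matching packet
# -p          do not put the interface into promiscuous mode
# -W byline   honor linefeeds in the payload, printing SIP headers one per line
# foo         match only packets whose payload contains the username "foo"
# port 5060   a BPF filter restricting capture to the default SIP port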

I’ll include an excerpt (I’ve kept only the headers relevant to this issue) of the initial request from Jitsi. IP addresses and usernames have been changed to protect the innocent.

U 2013/07/19 22:17:34.920749 0.0.0.0:5060 -> 66.151.32.200:5060
REGISTER sip:ostel.co SIP/2.0.
CSeq: 1 REGISTER.
From: "foo" <sip:foo@ostel.co>;tag=1eb3467e.
To: "foo" <sip:foo@ostel.co>.
Via: SIP/2.0/UDP 0.0.0.0:49152;branch=z9hG4bK-393535-2269e43afef0b312554eb419a8d0540e.
User-Agent: Jitsi2.3.4752Linux.
Contact: "foo" ;expires=600.
.

# U 2013/07/19 22:17:34.921155 66.151.32.200:5060 -> 0.0.0.0:5060
SIP/2.0 401 Unauthorized.
CSeq: 1 REGISTER.
From: foo <sip:foo@ostel.co>;tag=1eb3467e.
To: foo <sip:foo@ostel.co>;tag=e01f0de2cdfebbeefc5ff0c8eabbb8b3.2f1f.
Via: SIP/2.0/UDP 0.0.0.0:49152;branch=z9hG4bK-393535-2269e43afef0b312554eb419a8d0540e;rport=5060.
WWW-Authenticate: Digest realm="ostel.co", nonce="Uen0alHp8z4d6ePDl83RtMwARltAxzQu", qop="auth".
Server: kamailio (4.0.2 (x86_64/linux)).

If you read the response, you’ll see Kamailio sent 401 Unauthorized. This is normal for SIP authentication. A second client request should follow, containing an Authorization header with an MD5 digest and a nonce. When Kamailio receives this request, it checks the auth database and sends a 200 OK response; the client is then authenticated.
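
For reference, here is a minimal Python sketch of how a client computes that digest under the qop=auth scheme advertised in the WWW-Authenticate header (RFC 2617); all of the values below are illustrative placeholders, not values taken from this trace:

import hashlib

def md5_hex(s):
    return hashlib.md5(s.encode()).hexdigest()

# Illustrative placeholders; the real nonce comes from the 401 response.
username, realm, password = "foo", "ostel.co", "secret"
method, uri = "REGISTER", "sip:ostel.co"
nonce, cnonce, nc, qop = "Uen0alHp...", "0a4f113b", "00000001", "auth"

ha1 = md5_hex(f"{username}:{realm}:{password}")  # credentials hash
ha2 = md5_hex(f"{method}:{uri}")                 # method/URI hash
response = md5_hex(f"{ha1}:{nonce}:{nc}:{cnonce}:{qop}:{ha2}")
print(response)  # sent back in the Authorization header's response= parameter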

The SIP dialog looks good, but Jitsi still doesn’t register. The dialog flow is cut off after the 401 Unauthorized response. It’s almost as if something has blocked the response to the client.

Since I could register Linphone using the same account, I ran the same trace for that client. Here’s the excerpt.

U 2013/07/19 22:33:18.372770 0.0.0.0:42680 -> 66.151.32.200:5060
REGISTER sip:ostel.co SIP/2.0.
Via: SIP/2.0/UDP 0.0.0.0:49153;rport;branch=z9hG4bK359459505.
From: <sip:foo@ostel.co>;tag=142131416.
To: <sip:foo@ostel.co>.
CSeq: 3 REGISTER.
Contact: .
User-Agent: LinphoneAndroid/2.1.2-1-g23b7fc0 (eXosip2/3.6.0).
.

# U 2013/07/19 22:33:18.373112 66.151.32.200:5060 -> 0.0.0.0:42680
SIP/2.0 401 Unauthorized.
Via: SIP/2.0/UDP 0.0.0.0:49153;rport=42680;branch=z9hG4bK359459505.
From: <sip:foo@ostel.co>;tag=142131416.
To: <sip:foo@ostel.co>;tag=e01f0de2cdfebbeefc5ff0c8eabbb8b3.4065.
CSeq: 3 REGISTER.
WWW-Authenticate: Digest realm="ostel.co", nonce="Uen4GlHp9u4FwHNY/uE1iQQNCfGHJiob", qop="auth".
Server: kamailio (4.0.2 (x86_64/linux)).

This 401 Unauthorized response was received by the client, and the follow-up request with the Authorization header was sent with the correct digest. Linphone registered. I made a call. Everything worked fine. Indeed, WTF?

I stared at these traces for a while to get a clue. Look again at the first line of the request from Jitsi. You’ll see a timestamp followed by two IP:port pairs. Notice that the port on the first IP is 5060 and the port on the second IP is also 5060. This means the source port used by Jitsi on my home network is UDP port 5060. In order for a response to come back to Jitsi, it must enter my network on the same port it exited. Now read the top line of the response from Kamailio. Indeed, the server sent the response to UDP port 5060.

Now look at the same flow for Linphone. There is a very different source port in that dialog. In this case, Kamailio sent the response to UDP port 42680 and Linphone received it. Also notice the IP address used by Kamailio as the destination of the response is the same one in the dialog from Jitsi.

The question remained: why can’t Jitsi get the same kind of SIP response on UDP port 5060? And why is Jitsi using a single fixed source port for outgoing traffic anyway? That value can be dynamic. I configured Jitsi to use a different port for insecure SIP; its advanced SIP configuration includes the key SIP client port. I set this to 5062 (5061 is conventionally used for secure SIP traffic, so I incremented by 2) and tried to register again.

SUCCESSSSSSSSSSSS!

To be thorough, I changed Jitsi’s SIP port again, this time to a 5-digit number I randomly typed on my keyboard without looking.

SUCCESSSSSSSSSSSS!

So if Jitsi can register to Kamailio on any port other than UDP port 5060, WTF is going on? I had a suspicion, and I tried one more test before calling it: I configured Jitsi to connect on TCP port 5060. It registered successfully. Now I know what’s going on. I have a sad.
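
In hindsight, there is a quick way to reproduce this check without any SIP client at all: a bare-bones UDP probe that compares a fixed source port against an ephemeral one. This is only a sketch; a proper probe would send a fully formed OPTIONS request with valid Via and branch headers, and a server may ignore the malformed payload below:

import socket

SERVER = ("ostel.co", 5060)

def probe(src_port):
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.bind(("", src_port))  # 0 tells the OS to pick an ephemeral port
    s.settimeout(5)
    # Deliberately minimal payload; a real server may answer 400 or drop it.
    s.sendto(b"OPTIONS sip:ostel.co SIP/2.0\r\n\r\n", SERVER)
    try:
        data, _ = s.recvfrom(2048)
        print("reply on source port %d: %r" % (s.getsockname()[1], data[:40]))
    except socket.timeout:
        print("no reply on source port %d (filtered upstream?)" % s.getsockname()[1])
    finally:
        s.close()

probe(5060)  # fixed source port, the case that failed for Jitsi
probe(0)     # ephemeral source port, the case that worked for Linphone

If the second probe gets an answer and the first times out, something between you and the server is filtering UDP port 5060.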

CONCLUSION

My ISP, Verizon FiOS, has a firewall running somewhere upstream (it could be on the router they provided; I haven’t checked yet) that blocks incoming UDP traffic to port 5060. This probably falls under the section of their TOS that forbids running servers, since Verizon sells voice service for an additional fee on top of data service, despite both running over the same fiber connection to my house. It seems Verizon doesn’t want its data-only customers to get in the way of that sweet cheddar delivery each month in exchange for phone service.

This sucks on two levels.

LEVEL 1

Why is my ISP censoring my incoming traffic when I have 5 Mbps of incoming bandwidth? I assume the answer is: because they can. *desolate frowny face*

LEVEL 2

Why doesn’t Jitsi use a dynamic source port for SIP requests? I assume the answer is: Jitsi is open source, so why don’t I change this and send a patch upstream?

Both levels are formidable challenges to overcome. Convincing Verizon to play nice on the Internet feels like a vanity project; I’m writing that off. Making a change to the SIP stack in Jitsi is well within the GP team’s expertise, myself included, but it’s not a trivial undertaking. And since this would change a default configuration, there is probably a reason the upstream devs made this choice; so in addition to the programming work, there’s the work of convincing the developers that the change is worth a new release.

Since this is specific to Jitsi, I’m going to follow up with the developers and see if I missed anything. Stay tuned for part two.

Thanks for listening. Stay safe!

Read the original:
Jitsi, ostel.co and ISP censorship | The Guardian Project

Posted in Jitsi | Comments Off on Jitsi, ostel.co and ISP censorship | The Guardian Project

Global Futurist Jack Uldrich to Discuss Latest Technological Trends with RILA

Posted: February 25, 2015 at 12:40 am

Orlando, FLA (PRWEB) February 24, 2015

Yogi Berra once famously said, “The future ain’t what it used to be.” And he was right. In fact, according to trend expert and keynote speaker Jack Uldrich, the future “is going to be downright unusual.” This raises the obvious question: how do organizations prepare for an uncertain and unpredictable future? The answer, says Uldrich, is that “leaders and their organizations must think and act in unorthodox ways.”

Uldrich, who delivered a keynote to executives of the Retail Industry Leaders Association (RILA) at the “All Channels. All Challenges. One Conference” last April, will address the group again today, February 24th. He will deliver his keynote: “Business as Unusual: How Future Trends Will Transform the Supply Chain of Tomorrow.” (Some of Uldrich’s other clients in retail and supply management include the Women’s Food Forum, TRUNO, the Food Marketing Institute, GameStop’s Executive Summit, Utility Supply Management Association, and Verizon Wireless.)

An expert in change management and future trends, Uldrich will continue his discussion with RILA on how individuals in retail can sharpen their awareness of the transformational changes coming to the industry. Highlights will include how retailers can learn to “embrace ambiguity”; why finding a reverse mentor could be crucial; and why taking small risks may very well be the safest thing retailers can do to position themselves for success in the years to come.

With this particular keynote, Uldrich’s goal is to help his audience at RILA unlearn the barriers currently holding them back and unlock new levels of creativity and innovation. He will conclude his keynote by guiding participants through a series of tangible actions that will unleash their ability to create their own future and, in the process, help them achieve uncommon levels of success.

In his blog post, Unlearn…Just in Case, Uldrich says, “the global supply chain is an impressive feat of modern management. The problem is that in its quest to squeeze out ever greater efficiencies with its ‘just-in-time’ system of inventory, it has left itself extremely vulnerable to large, rare and unpredictable black swan events.”

The future “ain’t what it used to be” and Jack Uldrich has his finger on the pulse of what it may be. Parties interested in learning more about Jack, his books, his daily blog or his speaking availability are encouraged to visit his website. Media wishing to know more about either the event or interviewing Jack as a futurist or trend expert can contact Amy Tomczyk at (651) 343.0660.

Visit link:
Global Futurist Jack Uldrich to Discuss Latest Technological Trends with RILA

Posted in Futurist | Comments Off on Global Futurist Jack Uldrich to Discuss Latest Technological Trends with RILA

Liberty Utilities to build water-recharge plant in Goodyear

Posted: December 1, 2014 at 11:45 am


A private water company wants to build a 57-acre water-recharge facility and office complex in Goodyear near Luke Air Force Base and across the street from Litchfield Park.

Liberty Utilities and the Central Arizona Groundwater Replenishment District have partnered to build the state’s first public-private reclaimed-water recharge facility. The groundwater replenishment district will pay for about $6 million of the potential $8 million project, and Liberty will pay the remaining $2 million, said Greg Sorensen, president of Liberty Utilities’ Arizona and Texas divisions.

Liberty Utilities serves more than 17,500 water and wastewater customers in Maricopa County, mostly Goodyear residents who live north of Interstate 10 and Litchfield Park residents.

READ MORE: West Valley cities eyeing reclaimed water amid drought, population growth

The Central Arizona Groundwater Replenishment District is the groundwater replenishment authority of the Central Arizona Water Conservation District, commonly known as the Central Arizona Project.

The project will impact water rates, though by how much isn’t known. Ratepayers will pick up the $2 million cost to Liberty, Sorensen said.

“Anytime we invest money into utility assets, there ultimately will be an impact on rates,” Sorensen said. “But because we are able to work with (the conservation district) and obtain $6 million in funding, that’s $6 million we don’t have to invest in the utility.”

Construction of the project will begin next year and will be completed sometime in mid- to late-2016, depending on permitting, Sorensen said.

The proposed plant, north of Camelback Road between Sun Health and the Falcon Dunes Golf Course, will consist of a 15,000-square-foot office building and potentially six recharge basins built to hold reclaimed water that will percolate into the ground and the aquifer.

The rest is here:
Liberty Utilities to build water-recharge plant in Goodyear

Posted in Liberty | Comments Off on Liberty Utilities to build water-recharge plant in Goodyear

Internet and Human Capability: A Study In Parallel Evolution

Posted: November 7, 2014 at 7:41 am

Evolution is a funny thing. All organic creatures evolve in response to changes in their environment. And then in turn, the environment changes in response to new behaviors from the organisms that inhabit it. The same dynamic applies to the Internet and the people who use it. Innovation begets behavioral change. Behavioral change inspires new innovation.

But what happens when the pace of environmental change begins to outpace human change? What happens when the Internet comes under massive new strain from an exponential increase in the data, applications, and interactions that depend on it? And what must the industry do to prepare the network for this shift and ensure that people can continue to take advantage of emerging technological capabilities?

The changes in our digital environment are stunning. According to the research firm TeleGeography, global Internet capacity has reached 77 Tbps, more than doubling between 2010 and 2012. Much of this growth is driven by the popularity of social media and high-bandwidth applications such as video. With Cisco predicting more than 50 billion connected devices in the world by 2020, how can we ensure that the global connectivity infrastructure can cope under this strain?

What has been discussed less is the increased unpredictability of network needs. This is not simply a question of network capacity, but of how we connect with our technologies as a species. Managing a reliable global utility requires continual dedication to configuration data and infrastructure management. Without that, trust in the utility is not possible.

The Internet of Things will present similar challenges for all who are involved in creating it, as the reliability of physical objects, vehicles, and even human wearables and cybernetics takes on a whole new meaning. When a computer crashes, you can reboot. What happens when a network outage impacts personal belongings, vehicles, or body parts? That’s why the Internet must be built on a solid network, because any faults could have consequences far more serious than any we have yet imagined.

From a security point of view, the key challenge is controlling access to billions of devices on the network. How do you protect devices from attack while still ensuring they remain remotely configurable? How do you control access for devices such as pacemakers or vehicle steering systems that cannot be switched off?

You also need the right building blocks, including configuration data and infrastructure management, in place. You need robust platforms to ensure that the network is protected against the unexpected. Connectivity involves context and automation, and with data there are often ripples when something goes wrong. For example, if 100,000 devices kick off an action at once, the effect can quickly spread across the whole network because of the sheer number of connected devices. Policing the interaction between devices is key to protecting the network as a whole.
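
As a concrete illustration of that kind of policing, here is a minimal token-bucket rate limiter in Python; this is a generic sketch of the idea rather than any particular vendor’s mechanism:

import time

class TokenBucket:
    """Cap the rate of device-initiated actions while allowing short bursts."""
    def __init__(self, rate, burst):
        self.rate = rate            # tokens replenished per second
        self.capacity = burst       # maximum burst size
        self.tokens = burst
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Example: permit at most 1,000 device actions per second, bursts up to 5,000.
bucket = TokenBucket(rate=1_000, burst=5_000)
if not bucket.allow():
    pass  # defer, queue, or drop the action instead of flooding the network

Capping the aggregate rate turns a 100,000-device stampede into a bounded, schedulable load instead of a network-wide ripple.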

Good security isn’t cheap, and security controls must also be flexible enough that connectivity retains its agility. Any issues with software bugs or poor network performance will be amplified by the Internet of Things. It’s up to service providers to deliver these mechanisms and policies, and the industry needs to recognize that.

All this leads us to some interesting new questions, questions that shift our attention from supporting the network for the network’s sake to supporting the network for humans’ sake. How do you balance the evolution of the Internet with the evolution of how people use technology?

It’s true that technology now evolves faster than human behavior. And yet, human imagination is an infinite resource, whereas network capacity is not. This two-way tension raises some interesting questions.

More here:
Internet and Human Capability: A Study In Parallel Evolution

Posted in Post Human | Comments Off on Internet and Human Capability: A Study In Parallel Evolution

Professor Outlines Risks, Benefits of Genome Editing

Posted: October 21, 2014 at 1:44 am

Harvard Medical School professor George M. Church discussed the possibilities and potential dangers of genetic engineering on Wednesday. The lecture event, presented by the Harvard Museum of Natural History, covered a range of topics, including potential gains for genetic information and technologies and considerations of ethics and efficacy.

Church began the evening by highlighting the importance of genome testing, stressing that “whether or not you have family history, whether or not you [are of] a particular ethnicity, all of us are at risk for rare diseases.”

Genome testing has made advances in recent years, with the cost of sequencing an individual’s genome having decreased in the past decade. But further advances in genome testing, Church said, could allow us to “essentially see what’s currently invisible, to essentially see the genomes around us.”

Advances in the portability and affordability of genome testing, for instance, could lead to a sort of handheld DNA sequencing device that could dramatically impact diagnostics and field studies.

Moreover, Church said, “if you have an inexpensive way of [sequencing genomes] you can really start testing a lot of ideas about cause and effect,” with the potential to identify rare protective gene variants that could alleviate or eliminate some diseases.

“Your genetics is not your destiny,” Church said.

Church also discussed the possibility of de-extinction, bringing back species like the woolly mammoth. He predicted that the de-extinction process would largely depend on both ecological and economic considerations, with species judged both on their viability in modern ecosystems and on their utility. He highlighted the woolly mammoth as an example of a keystone species that could dramatically and positively impact the global ecosystem, citing his 2013 Scientific American article, which outlined how mammoths could contribute to the reversal of global warming by keeping the tundra frozen.

“Letting the tundra melt,” Church said, “is the equivalent of burning all of the forests in all of the world, and their roots, two and a half times over.” Bringing back the woolly mammoth could be one important step toward preventing this catastrophic release of carbon, according to Church.

Church also briefly touched on human genetic enhancements, noting that changes in the modern environment and human behavior have framed the topic of altering one’s genome in terms of necessity.

“Our ancestors didn’t need any genetic enhancements to be able to sit for twelve hours a day and eat fatty, sugary foods, but we need enhancements that handle that altered environment,” he said. “If we go into space, we need enhancements that handle radiation and osteoporosis…or else we’re dead. So what seems like an enhancement in one generation becomes life and death in another generation.”

Continue reading here:
Professor Outlines Risks, Benefits of Genome Editing

Posted in Genome | Comments Off on Professor Outlines Risks, Benefits of Genome Editing

Versatility of WaferGen Bio-systems Next-Gen Sequencing (NGS) Sample Prep Solutions to Be Showcased at the American …

Posted: October 17, 2014 at 2:48 pm

FREMONT, Calif., Oct. 17, 2014 /PRNewswire/ –WaferGen Bio-systems, Inc. (Nasdaq: WGBS) announced today that three papers describing the utility and versatility of its SmartChip and Apollo 324 technologies will be presented at the American Society of Human Genetics (ASHG) 2014 meeting.

WaferGen has collaborated with notable academic and commercial clinical reference CLIA labs and will be highlighted in several scientific papers at ASHG. Collaborators include Prof. Yusuke Nakamura at the University of Chicago and Dr. Yuriy Shevchenko from GeneDx. The following papers will be presented:

“We are excited to continue expanding our list of customers and collaborators for WaferGen’s Next-Generation Sequencing (NGS) sample prep products. At ASHG, we will showcase a full suite of NGS solutions designed to streamline and accelerate clinical research and routine patient testing. WaferGen will introduce an expanded portfolio of protocols on the Apollo 324, a compact and flexible NGS library prep system that enables rapid and full walk-away automation of a wide variety of NGS and molecular diagnostics applications. The Apollo 324 is ideal for clinical whole genome and whole exome sequencing. By combining Apollo 324 with WaferGen’s Seq-ReadyTE System, an innovative one-step target enrichment and library preparation solution ideal for custom gene panels, WaferGen addresses the critical need of clinical labs for easy-to-use and cost effective workflows,” said Ivan Trifunovich, President and Chief Executive Officer of WaferGen.

The American Society of Human Genetics 64th Annual Meeting will take place on Saturday, October 18 to Wednesday, October 22 at the San Diego Convention Center. WaferGen will be exhibiting their NGS solutions in Booth #222. For more information and registration, please see http://www.ashg.org/2014meeting/.

About WaferGen

WaferGen Bio-systems, Inc. is a life science company that offers innovative genomic solutions for clinical testing and research. The SmartChip MyDesign Real-Time PCR System is a high-throughput genetic analysis platform for profiling and validating molecular biomarkers via microRNA and mRNA gene expression profiling, as well as single nucleotide polymorphism (SNP) genotyping. The SmartChip TE System is a novel product offering for target enrichment geared towards clinical Next-Gen sequencing (NGS). The Seq-Ready TE System, powered by SmartChip massively-parallel single-plex PCR technology, is an innovative one-step target enrichment and library preparation solution. The Company now also offers the Apollo 324 product line used in library preparation for NGS. These three complementary technologies offer a powerful set of tools enabling more accurate, faster and cheaper genetic analysis based on Next-Gen Sequencing and Real-Time PCR.

For additional information, please see http://www.wafergen.com

Investor Contacts:

ICR, Inc.
Bob Yedid
bob.yedid@icrinc.com
646-277-1250

WaferGen Bio-systems, Inc.
Ivan Trifunovich
ivan.trifunovich@wafergen.com
(510) 651-4450

SOURCE WaferGen Bio-systems, Inc.

See original here:
Versatility of WaferGen Bio-systems Next-Gen Sequencing (NGS) Sample Prep Solutions to Be Showcased at the American …

Posted in Human Genetics | Comments Off on Versatility of WaferGen Bio-systems Next-Gen Sequencing (NGS) Sample Prep Solutions to Be Showcased at the American …