Tag Archives: search-engines

Are there enough robots? – Robotics Tomorrow (press release)

Posted: February 15, 2017 at 12:18 am

Len Calderone for RoboticsTomorrow

It is anticipated that our economy will need to generate about a million jobs a year just to keep up with future growth. Because of the digital revolution, many new jobs have been created, but they are not labor intensive. This is where robots come into play. As the economy expands, we will need both humans for the mental tasks and robots to handle the tedious and dangerous work.

Automation is extending beyond factories and distribution centers. White-collar jobs are starting to be replaced by artificial intelligence, which has already taken over various human jobs in music, journalism, teaching, research and other typically human careers. Attorneys are replacing paralegals with search engines, which are more efficient at finding relevant material than any human. Medical devices are assisting doctors by analyzing a patient's symptoms and suggesting solutions.

There will be a time when robots make our goods and handle the services that support those goods. China already knows that it does not have enough robots in its workforce. China is now the fastest-growing and largest robotics market in the world, due mainly to an aging population, something the U.S. is also facing. By next year, the robot population in China will surge: a third of all robots manufactured will be sold there.

We are going through an industrial revolution, and it is accelerating. In the next few years, around 1.4 million industrial robots will enter service in factories around the world. In the high-revenue automotive sector, global investments in industrial robots increased by a record-breaking 43 percent in just one year. The international market value for robotic systems is now about $32 billion. In the race for automation in manufacturing, the European Union is currently one of the global frontrunners, with 65 percent of EU countries having an above-average number of industrial robots per 10,000 employees. Still, the strongest growth will be in China, where 40 percent of the worldwide market volume of industrial robots is expected to be sold in 2019 (World Robot Statistics, issued by the International Federation of Robotics).

There doesn't seem to be a shortage of industrial robots: the number deployed worldwide will increase to around 2.6 million units by 2019. Seventy percent of industrial robots presently work in the automotive, electronics, metal and machinery industries.

At present, the U.S. is the fourth-largest single market for industrial robots in the world. Across the U.S., Canada and Mexico, newly installed industrial robots rose by 17 percent. The U.S. accounts for three-quarters of all units sold in the region, growing 5 percent, while demand in Canada rose by 49 percent and Mexico grew by 119 percent. If the economic situation holds, we might see average annual growth of 5 to 10 percent in robot sales from 2017 to 2019. Right now, NAFTA is on an unsteady course, so these figures might change.

HIT Robot Group, a Chinese company associated with the Harbin Institute of Technology, created an automated production line for lithium ion batteries that appears to be one giant robot. Robotic vehicles carry components between several manufacturing machines. The only place where you can find humans is inside a control room in the center. HIT estimates the new factory could reduce human labor by as much as 85 percent while manufacturing 150,000 batteries a day.

Patents for robotics and autonomous systems have jumped by double digits year-on-year for the last three years. According to a report published by the UK Intellectual Property Office, globally published patents for these technologies have increased to account for 9 percent of all global patents, with Japan, Germany and the U.S. holding most of them.

In North America, robot orders were up 10 percent in 2016 compared with 2015, according to the Robotic Industries Association: 34,606 robots were ordered, with a total market value of $1.9 billion. In the fourth quarter, robot orders hit 10,621 units valued at $561 million, up 21 percent from a year earlier, a good indicator for 2017. The global industrial robotics market is expected to reach $79.58 billion by 2022, growing at a compound annual growth rate (CAGR) of 11.92% between 2016 and 2022.
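
For readers who want to check the forecast's arithmetic, the CAGR relation is end = start × (1 + rate)^years. Below is a minimal Python sketch that derives the implied 2016 baseline from the forecast itself (the $32 billion figure quoted earlier covers the broader robotic-systems market, so it is not the base of this projection):

# Sketch: checking the article's growth arithmetic.
# CAGR is defined by: end_value = start_value * (1 + rate) ** years

def implied_start(end_value: float, rate: float, years: int) -> float:
    # Invert the CAGR formula to find the starting value.
    return end_value / (1 + rate) ** years

def cagr(start_value: float, end_value: float, years: int) -> float:
    # Compound annual growth rate between two values.
    return (end_value / start_value) ** (1 / years) - 1

# The article forecasts $79.58 billion by 2022 at 11.92% CAGR over 2016-2022.
base_2016 = implied_start(79.58, 0.1192, 6)
print(f"Implied 2016 market size: ${base_2016:.2f}B")    # about $40.5B
print(f"Check: CAGR = {cagr(base_2016, 79.58, 6):.2%}")  # 11.92%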

The food and consumer goods industries ordered 32% more robots in 2016 than the previous year, with food safety inspection, packaging, picking, handling and palletizing among the leading applications for these robots.

Capping their most successful year in history, the robotics, vision and motion control industries are preparing to come together for Automate 2017, North America's largest automation exhibition and conference, April 3-6 in Chicago. Over 300 exhibitors and a record 20,000 attendees are expected to gather.

How will the robot manufacturers keep up? Venture capital investments in robotics technology start-ups are on the rise. Capital investments by U.S. venture capital firms escalated to about $172 million. This increase in investments is an especially meaningful signal that the robotics industry could see an accelerated growth as these VC-backed companies grow. It provides a window into the future as to what the investment community believes will be promising and profitable.

The robotics industry is booming in China, with thousands of local robotics companies jumping into the market to manufacture both industrial robots and service robots. China is not only a large supplier of low-wage workers but also a source of high technology, with robotics manufacturing being one of the hottest trends. The Robot Report and the research team at Robo-STOX have identified 194 companies that make or are directly involved in making robots in China.

Future trends indicate that there will be a concentration on the collaboration of human and machine, simplified applications, and light-weight robots. We will also see an increased focus on modular robots and robotic systems, which will be marketed at exceptionally alluring prices.

Demand for industrial robots will also be driven by an assortment of factors, including the processing of new materials, energy efficiency, and improved automation concepts.

The one thing that is certain is that the manufacturers of robots are building an army of robots ready to step in and handle the tasks of the future.

Read the original here:

Are there enough robots? – Robotics Tomorrow (press release)

Posted in Robotics | Comments Off on Are there enough robots? – Robotics Tomorrow (press release)

Attention economy – Wikipedia

Posted: February 1, 2017 at 4:54 pm

Attention economics is an approach to the management of information that treats human attention as a scarce commodity, and applies economic theory to solve various information management problems. Put simply by Matthew Crawford, “Attention is a resource; a person has only so much of it.”[1]

In this perspective Thomas H. Davenport and J. C. Beck define the concept of attention as:

Attention is focused mental engagement on a particular item of information. Items come into our awareness, we attend to a particular item, and then we decide whether to act. (Davenport & Beck 2001, p. 20)

As content has grown increasingly abundant and immediately available, attention becomes the limiting factor in the consumption of information.[2] A number of software applications either explicitly or implicitly take the attention economy into consideration in their user interface design, based on the realization that if it takes the user too long to locate something, they will find it through another application. This is done, for instance, by creating filters to make sure the first content a viewer sees is relevant, of interest, or approved by the target demographic.[3] An attention-based advertising scheme may claim to measure the number of “eyeballs” by which its content is seen.[4]

Herbert A. Simon was perhaps the first person to articulate the concept of attention economics when he wrote:

“…in an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it” (Simon 1971, pp. 40–41).

He noted that many designers of information systems incorrectly represented their design problem as information scarcity rather than attention scarcity, and as a result they built systems that excelled at providing more and more information to people, when what was really needed were systems that excelled at filtering out unimportant or irrelevant information (Simon 1996, pp. 143–144).

In recent years, Simon’s characterization of the problem of information overload as an economic one has become more popular. Business strategists such as Thomas H. Davenport or Michael H. Goldhaber have adopted the term “attention economy” (Davenport & Beck 2001).

Some writers have even speculated that “attention transactions” will replace financial transactions as the focus of our economic system (Goldhaber 1997, Franck 1999). Information systems researchers have also adopted the idea, and are beginning to investigate mechanism designs which build on the idea of creating property rights in attention (see Applications).

According to digital culture expert Kevin Kelly, the modern attention economy is increasingly one where the consumer product costs nothing to reproduce and the problem facing the supplier of the product lies in adding valuable intangibles that cannot be reproduced at any cost. He identifies these intangibles as:[5]

The attention economy is also relevant to the social sphere. More specifically, long-term attention can be considered in terms of the attention a person dedicates to managing his or her interactions with others. Dedicating too much attention to these interactions can lead to “social interaction overload”, i.e. when people are overwhelmed in managing their relationships with others, for instance in the context of social network services in which people are the subject of a high level of social solicitation. Digital media and the Internet facilitate participation in this economy by creating new channels for distributing attention. Ordinary people are now empowered to reach a wide audience by publishing their own content and commenting on the content of others.[6]

Social attention can also be associated with collective attention, i.e. how “attention to novel items propagates and eventually fades among large populations.” (Wu & Huberman 2007)

“Attention economics” treats a potential consumer’s attention as a resource.[7] Traditional media advertisers followed a model suggesting that consumers go through a linear process called AIDA: Attention, Interest, Desire and Action. Attention is therefore the first, and a major, stage in the process of converting non-consumers. Since the cost to transmit advertising to consumers is now sufficiently low that more ads can be transmitted to a consumer (e.g., via online advertising) than the consumer can process, the consumer’s attention becomes the scarce resource to be allocated. Dolgin also states that a superfluity of information may hinder the decision making of an individual who keeps searching and comparing products as long as doing so promises to provide more than it uses up.[8]

One application treats various forms of information (spam, advertising) as a form of pollution or ‘detrimental externality’. In economics, an externality is a by-product of a production process that imposes burdens (or supplies benefits) on parties other than the intended consumer of a commodity. For example, air and water pollution are negative externalities that impose burdens on society and the environment.

A market-based approach to controlling externalities was outlined in Ronald Coase’s The Problem of Social Cost (Coase 1960). This evolved from an article on the Federal Communications Commission (Coase 1959), in which Coase claimed that radio frequency interference is a negative externality that could be controlled by the creation of property rights.

Coase’s approach to the management of externalities requires the careful specification of property rights and a set of rules for the initial allocation of the rights. Once this has been achieved, a market mechanism can theoretically manage the externality problem. The solution is not necessarily simple in its application to media content (Hay 1996).

Sending huge numbers of e-mail messages costs spammers very little, since the costs of e-mail messages are spread out over the internet service providers that distribute them (and the recipients who must spend attention dealing with them). Thus sending out as much spam as possible is a rational strategy: even if only 0.001% of recipients (1 in 100,000) is converted into a sale, a spam campaign can be profitable (Mangalindan 2002). Spammers are demanding valuable attention from potential customers, but they are avoiding paying a fair price for this attention due to the current architecture of e-mail systems.

One way this might be implemented is by charging senders a small fee per e-mail sent, often referred to as a “Sender Bond.” It might be close to free for an advertiser to send a single e-mail message to a single recipient, but sending that same e-mail to 1000 recipients would cost him 1000 times as much. A 2002 experiment with this kind of usage-based e-mail pricing found that it caused senders to spend more effort targeting their messages to recipients who would find them relevant, thus shifting the cost of deciding whether a given e-mail message is relevant from the recipient to the sender (Kraut 2002).
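
A short worked example makes the incentive shift in the last two paragraphs concrete. Only the 0.001 percent conversion rate comes from the text above; the message volume, revenue per sale and per-message fee below are hypothetical numbers chosen for illustration (a Python sketch):

# Sketch of the spam economics described above. Only the 0.001%
# conversion rate (1 in 100,000) comes from the article.

def campaign_profit(recipients: int, conversion_rate: float,
                    revenue_per_sale: float, cost_per_message: float) -> float:
    # Expected profit of a bulk e-mail campaign.
    revenue = recipients * conversion_rate * revenue_per_sale
    return revenue - recipients * cost_per_message

N = 10_000_000   # messages sent (hypothetical)
RATE = 0.00001   # 0.001% of recipients buy
SALE = 50.0      # revenue per sale (hypothetical)

# Today: the sender's marginal cost is effectively zero, so spam pays.
print(campaign_profit(N, RATE, SALE, cost_per_message=0.0))    # 5000.0

# With a small per-message fee, bulk spam flips to a loss, while a
# well-targeted mailing with a higher conversion rate stays profitable.
print(campaign_profit(N, RATE, SALE, cost_per_message=0.001))  # -5000.0
print(campaign_profit(1_000, 0.02, SALE, cost_per_message=0.001))  # 999.0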

Closely related is the idea of selling “interrupt rights,” or small fees for the right to demand one’s attention (Fahlman 2002). The cost of these rights could vary according to the interruptee: interrupt rights for the CEO of a Fortune 500 company would presumably be extraordinarily expensive, while those of a high school student might be lower. Costs could also vary for an individual depending on context, perhaps rising during the busy holiday season and falling during the dog days of summer. Interruptees could decline to collect their fees from friends, family, and other welcome interrupters.

Another idea in this vein is the creation of “attention bonds,” small warranties that some information will not be a waste of the recipient’s time, placed into escrow at the time of sending (Loder, Van Alstyne & Wash 2004). Like the granters of interrupt rights, receivers could cash in their bonds to signal to the sender that a given communication was a waste of their time or elect not to cash them in to signal that more communication would be welcome.
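
To make the mechanics concrete, here is a minimal sketch of that bond lifecycle, loosely following Loder, Van Alstyne & Wash (2004); the class names, API and amounts are illustrative inventions, not part of the published proposal:

# Minimal sketch of an "attention bond" lifecycle. All names and
# amounts here are illustrative.

from dataclasses import dataclass

@dataclass
class BondedMessage:
    sender: str
    recipient: str
    body: str
    bond: float  # amount held in escrow until the recipient responds

class Escrow:
    def __init__(self):
        self.balances: dict[str, float] = {}

    def _credit(self, party: str, amount: float) -> None:
        self.balances[party] = self.balances.get(party, 0.0) + amount

    def send(self, msg: BondedMessage) -> None:
        # The sender's bond is locked up when the message goes out.
        self._credit(msg.sender, -msg.bond)

    def cash_in(self, msg: BondedMessage) -> None:
        # Recipient keeps the bond: "this wasted my attention."
        self._credit(msg.recipient, msg.bond)

    def release(self, msg: BondedMessage) -> None:
        # Recipient returns the bond: "more mail like this is welcome."
        self._credit(msg.sender, msg.bond)

escrow = Escrow()
ad = BondedMessage("advertiser", "alice", "Buy now!", bond=0.05)
escrow.send(ad)
escrow.cash_in(ad)      # alice is paid $0.05 for her wasted attention
print(escrow.balances)  # {'advertiser': -0.05, 'alice': 0.05}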

Supporters of attention markets for controlling spam claim that their solutions are superior to the alternatives for managing uses of information systems where there is no consensus on whether the use is pollution or not. For example, the use of e-mail or text messages for rallying political support or by non-profit charitable organizations may be considered spam by some users but legitimate use by others. Laws against spam put the power to make this decision in the hands of government, while technological solutions like filtering technologies put it in the hands of private companies or technologically savvy users. A market-based solution, on the other hand, allows the possibility of individual negotiation over the worth of a given message rather than a unilateral decision by a controlling party (Loder, Van Alstyne & Wash 2004, p. 10). Such negotiation, though, itself consumes attention and carries an attention cost.

As search engines have become the primary means for finding and accessing information on the web, high rankings in the results for certain queries have become valuable commodities, due to the ability of search engines to focus searchers’ attention. Like other information systems, web search is vulnerable to pollution: “Because the Web environment contains profit seeking ventures, attention getting strategies evolve in response to search engine algorithms” (Page 1998). It is estimated that successful exploitation of such strategies, known as web spam, is a potential $4.5 billion per year business (Singhal 2004, p. 16).

Since most major search engines now rely on some form of PageRank (recursive counting of hyperlinks to a site) to determine search result rankings, a gray market in the creation and trading of hyperlinks has emerged. Participants in this market engage in a variety of practices known as link spamming, link farming, and reciprocal linking.
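
For readers unfamiliar with the recursive link counting mentioned above, a toy power-iteration version of PageRank shows both the idea and why a gray market in links pays off; this is an illustrative sketch, not any search engine's production algorithm:

# Toy power-iteration PageRank: a page is important if important
# pages link to it. Illustrative only.

def pagerank(links: dict[str, list[str]], damping: float = 0.85,
             iterations: int = 50) -> dict[str, float]:
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            targets = outlinks or pages  # dangling pages spread rank evenly
            for target in targets:
                new_rank[target] += damping * rank[page] / len(targets)
        rank = new_rank
    return rank

# A tiny "link farm": b, c and d all point at a, inflating a's rank.
# That inflation is exactly the incentive behind the gray market in links.
web = {"a": ["b"], "b": ["a"], "c": ["a"], "d": ["a"]}
print(pagerank(web))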

One proposed countermeasure, the “nofollow” attribute, lets site owners mark links, such as those in user comments, that search engines should not count toward rankings. However, as opponents of the “nofollow” attribute point out, while this solution may make it incrementally easier for search engines to detect link spam, it does not appreciably change the incentive structure for link spammers unless 100% of existing systems are upgraded to support the standard: as long as some critical mass of spammable sites exists, link spam will continue. Furthermore, the “nofollow” attribute does nothing to combat link farming or reciprocal linking. There is also a philosophical question of whether the links of site commentators (as opposed to site owners) should be treated as “second-class,” leading to the claim that the attribute “heists commentators’ earned attention” (NoNoFollow.net 2005).
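
As a concrete illustration of the mechanism, here is a minimal sketch of how a crawler might separate nofollow links from links that count toward ranking, using Python's standard-library HTMLParser; the markup and URLs are made up:

# Sketch: separating rel="nofollow" links (ignored for ranking) from
# links that count. Markup and URLs are made up.

from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.counted, self.ignored = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attr_map = dict(attrs)
        href = attr_map.get("href")
        if href is None:
            return
        rels = (attr_map.get("rel") or "").split()
        (self.ignored if "nofollow" in rels else self.counted).append(href)

page = """
<p>Post body with an <a href="https://example.com/docs">editorial link</a>.</p>
<div class="comments">
  <a href="https://spam.example/pills" rel="nofollow">comment link</a>
</div>
"""
parser = LinkExtractor()
parser.feed(page)
print("counts toward ranking:", parser.counted)   # editorial link
print("excluded (nofollow):", parser.ignored)     # comment link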

Another issue, similar to the issue discussed above of whether or not to consider political e-mail campaigns as spam, is what to do about politically motivated link campaigns or Google bombs (Tatum 2005). Currently the major search engines do not treat these as web spam, but this is a decision made unilaterally by private companies. There is no opportunity for negotiation over the question of what is an appropriate use of attention expressed through hyperlinking. It remains to be seen whether a market-based approach might provide more flexible handling of these gray areas.

The paid inclusion model, as well as more pervasive advertising networks like Yahoo! Publisher Network and Google’s AdSense, work by treating consumer attention as the property of the search engine (in the case of paid inclusion) or the publisher (in the case of advertising networks). This is somewhat different from the anti-spam uses of property rights in attention, which treat an individual’s attention as his or her own property.

Originally posted here:

Attention economy – Wikipedia

Posted in Resource Based Economy | Comments Off on Attention economy – Wikipedia

Ai dictionary definition | ai defined – YourDictionary

Posted: August 30, 2016 at 11:03 pm

(1) See Adobe Illustrator.

(2) (Artificial Intelligence) Devices and applications that exhibit human intelligence and behavior, including robots, driverless cars, medical diagnosis and expert systems. Voice and natural language recognition are major components. Artificial intelligence implies the capability to learn and adapt through experience, and today’s large organizations, search engines and social media sites are learning billions of details about the world’s content and human behavior every day. One result of this knowledge is the voice-activated, natural language assistant, such as Apple’s Siri, Google Now and Microsoft’s Cortana (see virtual assistant). See Big Data, autonomous vehicle and expert system.

An Earlier Buzzword
Decades ago, the AI buzzword was very much abused, as it referred to any and all advancements. However, the acid test of AI was defined in 1950 by English scientist Alan Turing, who said, "A machine has artificial intelligence when there is no discernible difference between the conversation generated by the machine and that of an intelligent person" (see Turing test). Question-and-answer dialog is already here and will continue to get better; however, a "real" conversation like those of the androids in the movies could take a very long time. See social robot, computer generations, neural network, AI anxiety and Watson.

Artificial Means Human
The term "intelligence" means processing capability; therefore, every computer is intelligent. However, artificial intelligence implies human-like intelligence, an ironic twist in terminology.

Shakey the Robot

Developed in 1969 by the Stanford Research Institute, Shakey was the first fully mobile robot with artificial intelligence. Seven feet tall, Shakey was named after its rather unstable movements. (Image courtesy of The Computer History Museum, http://www.computerhistory.org)

Forty-Four Years Later – Still a Bit Shaky

Funded by DARPA and made by Boston Dynamics, the 400-pound, 6’2″ Atlas was designed for emergency rescue. Built in 2013, Atlas stumbled a lot in its first tests; however, teams of AI engineers are teaching Atlas to become very sophisticated. (Image courtesy of Boston Dynamics, http://www.bostondynamics.com)

Read the rest here:

Ai dictionary definition | ai defined – YourDictionary

Posted in Ai | Comments Off on Ai dictionary definition | ai defined – YourDictionary

Consulting Services – Newmarket

Posted: February 7, 2016 at 1:43 am

Newmarket Consulting Services help customers realize the value and maximize the benefits of their Newmarket technology solutions. Newmarket business consultants use a proven methodology and framework to ensure that expected results and ROI are achieved, including benchmarks against industry best practices and standards.

Since 1985, Newmarket has worked closely with the world’s leading hospitality organizations, analyzing how to best do business in the ever-changing market. Over time, Newmarket has developed a set of best practices as the hospitality industry’s leading supplier of business services. Newmarket client statistics include:

Organizations should know their competitive position relative to industry best practices. Work with Newmarket to conduct a SCORE Assessment. This in-depth evaluation measures group sales and catering business processes to create an action plan for change. Increase efficiency and profitability – know the SCORE!

The Newmarket SCORE Assessment introduces a new way to analyze current organizational standards and procedures against industry leaders. From capturing account information and distributing BEOs to performance measurement and reporting, sales and service practices are compared to optimum industry processes.

Customers receive a comparative score, a set of recommendations, and an actionable change plan to implement needed improvements.

Learn more about the SCORE Assessment.

Newmarket offers low-cost Remote NSA Services for the ongoing administration of Delphi. Remote NSA Services allow hotel sales professionals to delegate system tasks to an experienced administrator on the Newmarket services team. The on-staff, certified NSA manages and administers Delphi using remote access tools.

Key benefits of utilizing Remote NSA Services include:

Learn more about Remote NSA Services.

Room diagrams are a valuable tool, enhancing communication with clients by allowing them to envision events in a function space. Newmarket CAD Services add value by creating dimensionally accurate diagrams (2D and 3D) that are then deployed using the Newmarket diagrams solution.

Diagrams WebView is an interactive website tool designed specifically for hospitality to better showcase property features to clients and prospects by using an interactive, dynamic rendering of the venue, as well as improving search engine optimization (SEO). With Diagrams WebView, clients and prospects navigate the property layout, meeting room floor plans, configurations, and capacities.

Newmarket understands the data management challenges that arise during times of change, including system upgrades, new implementations, mergers, and changes in ownership. In response, the experienced Data Services team can assist by seamlessly navigating change during many different circumstances, including:

Today, more than half of new business in group sales for hotels, conference centers, and other event venues is generated via the Internet. Hospitality organizations must have a clear strategy in order to capture business from multiple online channels, including websites, search engines, social networks, and third-party lead sources. With an Internet Presence Evaluation, Newmarket helps customers improve their online presence to ensure they are maximizing their reach and connectivity while capturing valuable, targeted online leads.

Continue reading here:
Consulting Services – Newmarket

Posted in NSA | Comments Off on Consulting Services – Newmarket

Human Traffickers Caught on Hidden Internet

Posted: March 10, 2015 at 3:41 am

A new set of search tools called Memex, developed by DARPA, peers into the deep Web to reveal illegal activity

Hidden in Plain Sight: Investigators are using DARPA’s Memex technology to pull information from the so-called “deep Web” that can be used to find and prosecute human traffickers. Courtesy of PhotoDisc/Getty Images.

In November 2012 a 28-year-old woman plunged 15 meters from a bedroom window to the pavement in New York City, a devastating fall that left her body broken but alive. The accident was an act of both desperation and hope: the woman had climbed out of the sixth-floor window to escape a group of men who had been sexually abusing her and holding her captive for two days. Four months ago the New York County District Attorney's Office sent Benjamin Gaston, one of the men responsible for the woman's ordeal, to prison for 50 years to life.

A key weapon in the prosecutor's arsenal, according to the NYDA's Office: an experimental set of Internet search tools the U.S. Department of Defense is developing to help catch and lock up human traffickers. Although the Defense Department and the prosecutor's office had not publicly acknowledged using the new tools, they confirmed to Scientific American that the Defense Advanced Research Projects Agency's (DARPA) Memex program provided advanced Internet search capabilities that helped secure the conviction.

DARPA is creating Memex to scour the Internet in search of information about human trafficking, in particular advertisements used to lure victims into servitude and to promote their sexual exploitation. Much of this information is publicly available, but it exists in the 90 percent of the so-called deep Web that Google, Yahoo and other popular search engines do not index. That leaves untouched a multitude of information that may not be valuable to the average Web surfer but could provide crucial information to investigators. Google would not confirm that it indexes no more than 10 percent of the Internet, a statistic that has been widely reported, but a spokesperson pointed out that the company's focus is on whether its search results are relevant and useful in answering users' queries, not whether it has indexed 100 percent of the data on the Internet.

Much of this deep Web information is unstructured data gathered from sensors and other devices that may not reside in a database that can be scanned or crawled by search engines. Other deep Web data comes from temporary pages (such as advertisements for illegal sexual and similarly illicit services) that are removed before search engines can crawl them. Some areas of the deep Web are accessible only using special software such as the Tor Onion Router, which allows people to secretly share information anonymously via peer-to-peer connections rather than going through a centralized computer server. DARPA is working with 17 different teams of researchers, from both companies and universities, to craft Internet search tools as part of the Memex program that give government, military and businesses new ways to analyze, organize and interact with data pulled from this larger pool of sources.

Law and order

DARPA has said very little about Memex and its use by law enforcement and prosecutors to investigate suspected criminals. According to published reports, including one from Carnegie Mellon University, the NYDA's Office is one of several law enforcement agencies that have used early versions of Memex software over the past year to find and prosecute human traffickers, who coerce or abduct people, typically women and children, for the purposes of exploitation, sexual or otherwise.
Memex, a combination of the words "memory" and "index" first coined in a 1945 article for The Atlantic, currently includes eight open-source, browser-based search, analysis and data-visualization programs as well as back-end server software that perform complex computations and data analysis. Such capabilities could become a crucial component of fighting human trafficking, a crime with low conviction rates, primarily because of strategies that traffickers use to disguise their victims' identities (PDF).

The United Nations Office on Drugs and Crime estimates there are about 2.5 million human trafficking victims worldwide at any given time, yet putting the criminals who press them into service behind bars is difficult. In its 2014 study on human trafficking (PDF) the U.N. agency found that 40 percent of countries surveyed reported fewer than 10 convictions per year between 2010 and 2012. About 15 percent of the 128 countries covered in the report did not record any convictions.

Evidence of criminals peddling such services online is hard to pinpoint because of the use of temporary ads and peer-to-peer connections within the deep Web. Over a two-year time frame traffickers spent about $250 million to post more than 60 million advertisements, according to DARPA-funded research. Such a large volume of Web pages, many of which are not posted long enough to be crawled by search engines, makes it difficult for investigators to connect the dots. This is, in part, because investigators typically search for evidence of human trafficking using the same search engines that most people use to find restaurant reviews and gift ideas. Hence the Memex project.

Inside Memex

At DARPA's Arlington, Va., headquarters Memex program manager Christopher White provided Scientific American with a demonstration of some of the tools he and his colleagues are developing. Criminal investigations often begin with little more than a single piece of information, such as an e-mail address. White plugged a demo address into Google to show how investigators currently work. As expected, he received a page of links from the portion of the Internet that Google crawls, also referred to as the surface Web, prioritized by a Google algorithm attempting to deliver the most relevant information at the top. After clicking through several of these links, an investigator might find a phone number associated with the e-mail address.

Thus far, White had pulled the same information from the Internet that most people would see. But he then faced a next step all Web users confront: sifting through pages of hyperlinks with very little analytical information available to tie different search results together. Just as important as Memex's ability to pull information from a broader swath of the Internet are its tools for identifying relationships among different pieces of data. These help investigators create data maps used to detect spatial and temporal patterns. One example could be a hub-and-spoke visualization depicting hundreds of Web sites connected to a single sex-services e-mail address, phone number or worker.
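
The data maps described above come down to joining listings on shared identifiers, so that one phone number or e-mail address connects a whole network of postings. A minimal sketch of that idea in Python, with entirely fabricated data (nothing here is drawn from Memex itself):

# Toy version of the hub-and-spoke idea: group scraped ads by shared
# contact details so one identifier reveals a network of postings.
# All data below is fabricated.

from collections import defaultdict

ads = [
    {"url": "site-a/post1", "phone": "555-0100", "email": None},
    {"url": "site-b/post7", "phone": "555-0100", "email": "x@example.com"},
    {"url": "site-c/post3", "phone": "555-0199", "email": "x@example.com"},
]

hubs = defaultdict(set)  # identifier -> set of ad URLs that used it
for ad in ads:
    for field in ("phone", "email"):
        if ad[field]:
            hubs[ad[field]].add(ad["url"])

# Any identifier appearing in more than one ad links those postings.
for identifier, urls in sorted(hubs.items()):
    if len(urls) > 1:
        print(identifier, "connects", sorted(urls))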

White also showed how Memex can generate color-coded heat maps of different countries that locate where the most sex advertisements are being posted online at any given time. These patterns and others could help reveal associations that investigators might otherwise miss, says White, who began working with DARPA in 2010 as a consultant developing data-science tools to support the U.S. military in Afghanistan.

Search results

The technology has already delivered results since DARPA began introducing Memex to select law enforcement agencies about a year ago. The NYDA's Office says that its new Human Trafficking Response Unit now uses DARPA's Memex search tool in every human trafficking case it pursues. Memex has played a role in generating at least 20 active sex trafficking investigations and has been applied to eight open indictments in addition to the Gaston conviction, according to the NYDA's Office. "Memex helps us build evidence-based prosecutions, which are essential to fighting human trafficking," says Manhattan District Attorney Cyrus R. Vance, Jr. "In these complex cases prosecutors cannot rely on traumatized victims alone to testify. We need evidence to corroborate or, in some cases, replace the need for the victim to testify."

Different components of Memex are helping law enforcement crack down on trafficking elsewhere in the country as well. A detective in Modesto, Calif., used a specific piece of software called Traffic Jam to follow up on a tip about one particular victim from Nebraska and ended up identifying a sex trafficker who was traveling with prostitutes across the Midwest and West. The investigation culminated in his arrest. Traffic Jam, developed independently of DARPA in 2011 by Carnegie Mellon University researchers and later spun off into a company called Marinus Analytics, enabled investigators to gather evidence by quickly reviewing ads the trafficker posted for several locales. DARPA has since awarded Carnegie Mellon a three-year, $3.6-million contract to enhance Traffic Jam's basic search capabilities as part of Memex, with machine-learning algorithms that can analyze results in depth, according to the university.

Carnegie Mellon researchers are also studying ways to apply computer vision to searches in a way that helps investigators identify images with similar elements, such as furniture from the same hotel room that appears in multiple images, even if the images themselves are not identical, says Jeff Schneider. Schneider is the project's principal investigator and a research professor in the Auton Lab at the university's School of Computer Science, which studies statistical data mining. Furniture in a hotel room, for example, could help law enforcement identify the location of trafficking operations.

Vance and other law enforcement officials welcome such advances. "Technology alone won't solve cases, but it certainly helps," he says. "We've had the most success with this effort when we married traditional field intelligence with the information this tool provides." White agrees that DARPA's technology is a supplement to other investigative methods, including interviews with victims. In addition to targeting human trafficking, law enforcement officials are finding that they can tap Memex to crack down on other, related crimes, including trafficking in guns and drugs.

Read more here:
Human Traffickers Caught on Hidden Internet

Posted in Post Human | Comments Off on Human Traffickers Caught on Hidden Internet

Longevity Science: Unraveling the Secrets of Human Longevity …

Posted: December 20, 2013 at 4:44 pm

The purpose of our studies: to understand the mechanisms of aging and longevity in order to extend healthy and productive human lifespan. This scientific and educational website contains over a hundred scientific and reference documents relevant to longevity and aging studies. It receives about 1,000 visits per day from many prestigious organizations, including the US Library of Congress, the US National Institutes of Health (NIH), the US Centers for Disease Control (CDC), and the Royal Society, the UK national academy of science. This website is the top-ranked result for the term "longevity science" in such major search engines as Google, Yahoo! and AlltheWeb.

Breaking News:

Table of Contents:

Dr. Natalia S. Gavrilova
Center on Aging, NORC at the University of Chicago
1155 East 60th Street, Chicago, IL 60637-2745
Fax: (773) 256-6313
Phone: (773) 702-1375
E-mail:
Brief Biographical Sketch | NIH Biosketch | Detailed Curriculum Vitae | Resume | Expertise Profile | Statement of Research Interests

We also maintain close scientific contacts with Dr. Bruce A. Carnes at the University of Oklahoma and with Dr. Yulia Kushnareva at the Burnham Institute, La Jolla, CA.

What we have found and published:

Available at:

THE RELIABILITY THEORY OF AGING AND LONGEVITY. Journal of Theoretical Biology, 2001, 213(4): 527–545. (Abstract, full text, press release and media coverage available.)

Read the rest here:
Longevity Science: Unraveling the Secrets of Human Longevity …

Posted in Human Longevity | Comments Off on Longevity Science: Unraveling the Secrets of Human Longevity …

Censorship – Wikipedia, the free encyclopedia

Posted: at 4:42 pm

Censorship is the suppression of speech or other public communication which may be considered objectionable, harmful, sensitive, politically incorrect or inconvenient as determined by a government, media outlet or other controlling body. It can be done by governments and private organizations or by individuals who engage in self-censorship. It occurs in a variety of contexts, including speech, books, music, films and other arts, the press, radio, television, and the Internet, for a variety of reasons: national security; controlling obscenity, child pornography, and hate speech; protecting children; promoting or restricting political or religious views; and preventing slander and libel. It may or may not be legal. Many countries provide strong protections against censorship by law, but none of these protections are absolute, and it is frequently necessary to balance conflicting rights in order to determine what can and cannot be censored.

Socrates defied censorship and was sentenced to drink poison in 399 BC for promoting his philosophies. Plato is said to have advocated censorship in The Republic. The playwright Euripides (480–406 BC) defended the true liberty of freeborn men: the right to speak freely.[2]

The rationale for censorship is different for various types of information censored:

Strict censorship existed in the Eastern Bloc.[9] Throughout the bloc, the various ministries of culture held a tight rein on their writers.[10] Cultural products there reflected the propaganda needs of the state.[10] Party-approved censors exercised strict control in the early years.[11] In the Stalinist period, even the weather forecasts were changed if they had the temerity to suggest that the sun might not shine on May Day.[11] Under Nicolae Ceaușescu in Romania, weather reports were doctored so that the temperatures were not seen to rise above or fall below the levels which dictated that work must stop.[11]

Independent journalism did not exist in the Soviet Union until Mikhail Gorbachev became its leader; all reporting was directed by the Communist Party or related organizations. Pravda, the predominant newspaper in the Soviet Union, had a monopoly. Foreign newspapers were available only if they were published by Communist Parties sympathetic to the Soviet Union.

Possession and use of copying machines was tightly controlled in order to hinder production and distribution of samizdat, illegal self-published books and magazines. Possession of even a single samizdat manuscript such as a book by Andrei Sinyavsky was a serious crime which might involve a visit from the KGB. Another outlet for works which did not find favor with the authorities was publishing abroad.

The People’s Republic of China employs sophisticated censorship mechanisms, referred to as the Golden Shield Project, to monitor the internet. Popular search engines such as Baidu also remove politically sensitive search results.[12]

Iraq under Baathist Saddam Hussein had much the same techniques of press censorship as did Romania under Nicolae Ceaușescu, but with greater potential violence.

Cuban media is operated under the supervision of the Communist Party’s Department of Revolutionary Orientation, which “develops and coordinates propaganda strategies”.[13] Connection to the Internet is restricted and censored.[14]

Censorship also takes place in capitalist nations, such as Uruguay. In 1973, a military coup took power in Uruguay, and the State practiced censorship. For example, writer Eduardo Galeano was imprisoned and later was forced to flee. His book Open Veins of Latin America was banned by the right-wing military government, not only in Uruguay, but also in Chile and Argentina.[15]

View post:
Censorship – Wikipedia, the free encyclopedia

Posted in Censorship | Comments Off on Censorship – Wikipedia, the free encyclopedia

Google Cites Censorship Risk in EU Data Control Lawsuit

Posted: February 26, 2013 at 10:46 pm

Google Inc. (GOOG) shouldn't have to remove content from its search engine that was lawfully published elsewhere, the company argued in a case at the European Union's top court that will set boundaries between freedom of expression and data-protection rights.

The operator of the world's largest search engine "isn't a data controller, it is a mere intermediary in terms of the data which it indexes," Google lawyer Francisco Enrique Gonzalez-Diaz told a panel of 15 judges at the EU Court of Justice hearing today. Direct requests for personal information to be removed from a search engine, even if the information was put online by a newspaper, would be a fundamental shift of responsibility from the publisher to the search engine and would amount to censorship, he argued.

The dispute raises questions about the scope of EU privacy rules when it comes to personal data on the Internet; the rights of search engines to use any online data to remain commercially successful; and who ultimately is in charge of what happens with the data. The Luxembourg-based court's ruling will be binding on courts across the 27-nation bloc.

The case was triggered by about 200 instances of Spain's data-protection authority ordering Google to remove information on people. The information in today's case concerned a Spanish man whose house was auctioned off for failing to pay taxes. Newspaper La Vanguardia published the information in 1998, and years later it could still be found via a Google search.

"In this case and in many other cases, serious harm is done to individuals," Joaquin Munoz Rodriguez, a lawyer representing the man, told the EU court. "The information is tracked and ordered by Google and contains, to a very large extent, personal data."

"Google is liable because it allows easy and quick access to information that wasn't easily found online before," he said.

Google faces privacy investigations around the world as it adds services and steps up competition with Facebook Inc. (FB) for users and advertisers. The Mountain View, California-based company created a uniform set of privacy policies last year for more than 60 products, unleashing criticism from regulators and consumer advocates over whether it was properly protecting data.

"People shouldn't be prevented from learning that a politician was convicted of taking a bribe, or that a doctor was convicted of malpractice," Google said in a blog post. "The substantive question before the court today is whether search engines should be obliged to remove links to valid legal material."

Data protection is currently policed by separate regulators across the EU. The bloc's executive body wants to simplify the system so companies deal with only one.

A lawyer for the European Commission, the EU's executive, argued today that Google does control data. That view diverges from an opinion of a group representing the bloc's data-protection watchdogs, which said search engines are generally not to be held primarily responsible for content.

Read the rest here:
Google Cites Censorship Risk in EU Data Control Lawsuit

Posted in Censorship | Comments Off on Google Cites Censorship Risk in EU Data Control Lawsuit

Google Closes Shopping Search Engine In China – Video

Posted: December 15, 2012 at 12:42 am



Google Closes Shopping Search Engine In China
Search engine giant Google Inc. has further reduced its stake in the Chinese market. The company announced on Tuesday that its e-commerce search engine is closing its China branch. Google says it's because the service didn't have the kind of impact it had hoped for. Google has seen steep competition from Chinese search engines and government censorship of its online sites. After it pulled out of China in 2010 over a censorship dispute, Google's search engine traffic in China shrank: according to Analysys International, Google went from holding over a third of that traffic in 2009 to just 6 percent this September. This week's pullout follows an earlier product withdrawal in September, when Google closed its music downloading service in China, again over low traffic. Google says it will continue to offer online advertising for Chinese businesses but will now charge companies for listings in its shopping search engine. (Video from NTDonChina, 0:55.)

Excerpt from:
Google Closes Shopping Search Engine In China – Video

Posted in Censorship | Comments Off on Google Closes Shopping Search Engine In China – Video

Gerald Celente – Corruption and Economic Crimes – Video

Posted: November 3, 2012 at 6:44 pm



Gerald Celente – Corruption and Economic Crimes
Gerald Celente – Corruption and Economic Crimes RT Channel – http://www.youtube.com We support free speech, free press, free markets and human rights. Planet 2.0 Global News Shop : p2gn.spreadshirt.com Planet 2.0 Global News RSS Feed : feeds.feedburner.com Fair Use Notice : Fair use is a limitation and exception to the exclusive right granted by copyright law to the author of a creative work. In United States copyright law, fair use is a doctrine that permits limited use of copyrighted material without acquiring permission from the rights holders. Examples of fair use include commentary, search engines, criticism, news reporting, research, teaching, library archiving and scholarship. It provides for the legal, unlicensed citation or incorporation of copyrighted material in another author's work under a four-factor balancing test.From:PlanetGlobalNewsViews:1 0ratingsTime:03:15More inNews Politics

Read the original:
Gerald Celente – Corruption and Economic Crimes – Video

Posted in Free Speech | Comments Off on Gerald Celente – Corruption and Economic Crimes – Video