Page 4 | Could Google tilt a close election?
Page 7 | Making Robots Mimic the Human Hand
Page 9 | Researchers Find Surprising Similarities Between Genetic and Computer Codes
Page 11 | Mobile APP Turning iPhone into a biologically-inspired hearing aid
Page 13 | RIT researchers develop advanced video and image processing
Page 14 | Holograms Add New Dimension to Fighting Fire
Page 16 | New clues to Wikipedia's shared super mind
Page 17 | NSF Official On New Supers, Data-Intensive Future
Page 21 | Google Australia funds universities to spruik computer science
Page 23 | Crowd-funding is working for Open Source projects
It certainly is clear that outside groups seek to manipulate Google's results. The consequences of such tactics in the consumer world are well known, with companies spending vast sums trying to goose search rankings for their products in make-or-break bids for profit. In the political realm, the creators of "Google bombs" managed to link the name of then-Sen. John Kerry, the Democratic presidential nominee in 2004, with the word "waffles" in search results. President George W. Bush had his name linked, through similar tactics, to the words "miserable failure." In 2010, a conservative group used a collection of linked Twitter accounts to affect search rankings about the Massachusetts special election that brought Scott Brown to the Senate, according to research by two computer science professors at Wellesley College. Google has resisted such tactics, and its vulnerability to manipulation from outside was limited in the 2012 election cycle, according to researchers, political professionals and search experts. Though search results on Google are generated by a complex and ever-changing algorithm (weighing, for example, links to other sites, content quality and the time spent on sites when people click through), the key factors emphasize relevance to users. The company works to spot and defeat those who seek to alter results unfairly, and it sometimes punishes those who do by demoting their search rankings. But Epstein's argument is based on a different scenario: What if manipulation came from within? Even those who harbour no doubts about Google's intentions generally agree that internal manipulation would be potent and, at least initially, hard to spot. "They could do something manually with these results, but I can't see why they would do that," said Mike Grehan, publisher of Search Engine Watch and a commentator whose views often are in line with Google's.
Yet Epstein and some others say the company's power alone, whether or not it uses it, calls out for legal safeguards. Though Microsoft, Yahoo and Facebook also operate search engines, Google has about two-thirds of the U.S. market. Even if Google has no plan to skew search rankings today, what if conditions or its corporate leadership changed over time? There is a bit of history of some powerful communications companies directly meddling in elections. "I don't think Google has an incentive to do this, but a future Google could," said Tim Wu, a Columbia University law professor and the author of "The Master Switch: The Rise and Fall of Information Empires." "The question of free speech in America is controlled by a few powerful gatekeepers who could subtly shape things." In the 1800s, Wu noted, Western Union employees often read telegrams from Democrats and shared their contents with Republicans, their political allies, or didn't deliver them. This stopped, Wu said, only with the arrival of forceful federal regulation. Epstein, a Harvard-trained psychologist and former editor in chief of Psychology Today, turned his attention to Google after the company flagged search results for a Web site that he ran, warning that it was infected with malicious programs that could harm visitors. Epstein complained publicly about the move and the lack of responsiveness from Google, e-mailing senior company officials. He later acknowledged that his site had been infiltrated by hackers, but the experience left him aghast at what he considered Google's unchecked power. He wrote blog posts calling for greater regulatory oversight of the company.
For his experiment, conducted with colleague Ronald E. Robertson at the American Institute for Behavioral Research and Technology, Epstein attempted to shape the perceptions of a random sampling of potential voters in California. The test involved an election most of the subjects knew little about: a close-fought campaign for prime minister of Australia in 2010. The researchers secretly altered the rankings of search results to help favoured candidates. After 15 minutes of searching and reading linked articles, it was clear that the manipulation had worked, with about 65 percent of subjects favouring the candidate getting elevated rankings, compared with 50 percent among a control group that saw impartial search results, according to Epstein. Three out of four subjects, meanwhile, reported no awareness that the search rankings had been altered. The lack of prior knowledge about the race or of alternative sources of information accentuated the effects of the search rankings, Epstein acknowledged. But he said the experiment made clear that manipulation is possible, powerful and hard to detect. However, the sheer volume of other information available to voters would make such manipulation hard to execute, said David Vladeck, a Georgetown University law professor and the former head of consumer protection at the Federal Trade Commission. Traditional news organizations, he said, probably have more power over the views of voters. "It is not clear to me that, even if Google tried to, it could exercise the same power over the American public as Fox News or MSNBC," Vladeck said. "The claim is such a difficult one to sustain that I find it hard to take it seriously."
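To make the mechanics of such a manipulation concrete, the short Python sketch below is purely illustrative; it is not Epstein and Robertson's materials or code. It shows how a hidden re-ranking step can push results about one candidate to the top of a results page while leaving the underlying set of articles unchanged. The candidate names, article titles and bias weight are all invented.

```python
# Illustrative sketch only: not the study's real data, interface, or protocol.
# It shows how silently re-ranking otherwise identical results changes which
# candidate a searcher reads about first.

from dataclasses import dataclass

@dataclass
class Result:
    title: str        # hypothetical article title
    candidate: str    # which candidate the article covers
    relevance: float  # neutral relevance score (higher = better match)

def rank(results, favoured=None, bias=0.0):
    """Order results by relevance, optionally adding a hidden boost
    to every article about the favoured candidate."""
    def score(r):
        boost = bias if r.candidate == favoured else 0.0
        return r.relevance + boost
    return sorted(results, key=score, reverse=True)

if __name__ == "__main__":
    results = [
        Result("Candidate A unveils economic plan", "A", 0.72),
        Result("Candidate B leads in new poll",     "B", 0.74),
        Result("Candidate A questioned on record",  "A", 0.70),
        Result("Candidate B rallies supporters",    "B", 0.69),
    ]
    neutral = rank(results)                          # what a control group sees
    slanted = rank(results, favoured="A", bias=0.1)  # the manipulated ordering
    print([r.title for r in neutral])
    print([r.title for r in slanted])
```

The two groups see exactly the same articles; only the ordering differs, which is the kind of change the study reports three out of four subjects did not notice.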
Federal regulations have in some circumstances limited what news organizations can do. The Fairness Doctrine once required broadcasters to present both sides of controversial issues, and media cross-ownership rules can still limit the ability of newspapers, for example, to own radio or television stations in the same metropolitan area. Some legal scholars contend that search engine rankings are covered under the First Amendment's free speech protections. Yet even those who think that search engines can have potent effects on elections differ on what kind of regulation, if any, would be sensible and effective. And it's not even clear what federal agency would have the authority to investigate allegations of abuse. The key lesson may be that search engines are not mere machines spitting out perfectly impartial results. They are driven by decisions, made by people who have biases. This does not necessarily make them evil, merely human. "The more trust we give to these kinds of tools, the more likely we can be manipulated down the road," said Panagiotis T. Metaxas, one of the computer science professors at Wellesley College who studied the Massachusetts election. "We need to understand, as people, as citizens, why we believe what we believe."
Craig Timberg
timbergc@washpost.com
References
[1] The Washington Post|
http://www.washingtonpost.com/opinions/couldgoogle-tilt-a-close-election/2013/03/29/c8d7f4e69587-11e2-b6f0-a5150a247b6a_story.html
The agency began with six teams and held a bake-off in which it chose three teams to continue in the last phase of the project. In the software project, Darpa supplied each team with a standard hand that it then programmed. "The grasping tasks were done so well that we believe that for the kinds of objects we had them pick up, ranging from a ball to a rock to tools like hammers, we don't need to do further work in grasping," Dr. Pratt said. Manipulating grasped objects was a more challenging task, he said, and one on which the teams would continue to do research. The program is financed for 18 more months. Darpa is also continuing to finance the development of low-cost arms at Barrett Technologies, a robotics research firm in Cambridge, Mass.; Sandia; iRobot; and SRI International, a research organization in Menlo Park, Calif. The agency is also planning to create a joint project to transfer some of the low-cost technology advances it has made in the project into a related effort to develop prosthetic limbs for wounded soldiers. Johns Hopkins University has received funds to develop a neural interface, a direct link from a robot arm to the human brain, and DEKA Research, an independent development laboratory headed by Dean Kamen in Manchester, N.H., has developed a separate wearable arm now being considered for approval by the Food and Drug Administration. That robotic arm is close to commercialization, said Geoffrey Ling, acting deputy director of Darpa's Defense Sciences Office. "We have pictures of young men doing rock climbing and one of the patients using chopsticks, which is really extraordinary," he said. "It provides a high degree of functionality, and the patients who have it are using it."
John Markoff
References
[1] The New York Times|
http://www.nytimes.com/2013/03/30/science/maki ng-robots-mimic-the-human-hand.html?_r=1&
"They have this enormous common pool of genes that they are freely sharing with each other. Bacterial systems can easily add or remove genes from their genomes through what's called horizontal gene transfer, a kind of file sharing between bacteria," Maslov said. The same goes for Linux operating systems, which allow free installation of components built and shared by a multitude of designers independently of one another. The theory wouldn't hold true for, say, a Windows operating system, which only runs proprietary programs. Maslov is co-principal investigator in the KBase program, which is led by principal investigator Adam Arkin of DOE's Lawrence Berkeley National Laboratory, with additional co-principal investigators Rick Stevens of DOE's Argonne National Laboratory and Robert Cottingham of DOE's Oak Ridge National Laboratory. Supported by DOE's Office of Science, the KBase program provides a high-performance computing environment that enables researchers to access, integrate, analyze and share large-scale genomic data to facilitate scientific collaboration and accelerate the pace of scientific discovery. DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.
Chelsea Whyte, Phone (631) 344-8671
Peter Genzer, Phone (631) 344-3174
References
[1] BrookHaven National Lab|
http://www.bnl.gov/newsroom/news.php?a=11518
Wendy Lecluyse added: "This new device opens up many intriguing research possibilities, allowing scientists to explore new ideas in hearing aid design and how they work in everyday settings. At the moment, we are particularly interested to find out how the preferred setting of each user corresponds with their hearing problem." The development of BioAid, which has been funded by the Engineering and Physical Sciences Research Council, is part of a research project to influence the future of hearing aids. The researchers want to hear about people's experiences using BioAid so that they can continue to perfect the technology. Users can get in touch, and find further information, at: http://bioaid.org.uk/.
University of Essex
Telephone: 01206 873529 E-mail: comms@essex.ac.uk.
References
[1] University of Essex|
http://www.essex.ac.uk/news/event.aspx?e_id=5095
But sometimes, Muir said, flames can make it difficult to see everything, so if holograms could be combined with thermal imaging to create a more complete picture, it would be a welcome addition.
Colin Wood
References
[1] Government Technology|
http://www.govtech.com/public-safety/HologramsAdd-New-Dimension-to-Fighting-Fire.html
A great example of this cooperative nature is Wikipedia's article on former U.S. President George W. Bush, a highly contested piece of Wiki real estate that has been edited some 45,000 times. "Show me a place on the Internet where people agree about George W. Bush?" asks DeDeo. "But the Wikipedia article reads as if it was written by aliens who didn't care [about Bush], although we know it was written by people who cared a lot." Just how Wikipedia manages this collective balance is something DeDeo was able to study in detail because, unlike most other social systems, every Wikipedia edit is recorded. "It's almost like you had closed-circuit cameras running as a society is creating itself," he says, so every move could be studied and watched. All these sequences of behaviors create what can be viewed as a historical grammar, like that of a language or even bird song. A bird song, for example, has a very simple grammar, with few elements and combinations possible, what's called a finite-state system. The historical language that creates and maintains Wikipedia might be expected to follow a rather limited grammar as well, but that's not what DeDeo discovered. "The big result is that the Wikipedia behavior is what we call non-finite state," DeDeo says. "It's constantly generating new patterns of behavior that haven't been seen before."
One possibility, he says, is that the unbounded source for these behavior patterns in Wikipedia is shared between people; it's the product of everyone's mind. "That's what's really exciting," he says.
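For readers unfamiliar with the term, the Python sketch below shows what a finite-state description of edit behaviour would look like: a fixed set of states and transitions that any observed sequence must follow. It is only an illustration of the concept, not DeDeo's model or data, and the event labels are invented; his finding is that Wikipedia's edit histories cannot be captured by any such bounded machine.

```python
# A deliberately tiny finite-state model of edit behaviour, invented for
# illustration; DeDeo's result is that real Wikipedia histories do NOT fit
# any fixed machine like this one.

TRANSITIONS = {
    # (current state, event): next state
    ("stable",  "edit"):    "active",
    ("active",  "edit"):    "active",
    ("active",  "revert"):  "dispute",
    ("dispute", "revert"):  "dispute",
    ("dispute", "discuss"): "stable",
}

def accepts(events, start="stable"):
    """Return True if the event sequence can be produced by the machine."""
    state = start
    for event in events:
        key = (state, event)
        if key not in TRANSITIONS:
            return False
        state = TRANSITIONS[key]
    return True

print(accepts(["edit", "edit", "revert", "discuss"]))  # True: fits the grammar
print(accepts(["revert", "edit"]))                     # False: no such pattern
```

A non-finite-state process, by contrast, keeps producing sequences that no fixed table of this kind can enumerate in advance, which is the sense in which the article says Wikipedia's behaviour is unbounded.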
References
[1] SANTA FE INSTITUTE|
http://www.santafe.edu/news/item/dedeowikipedia-shared-super-mind/
When asked about the supercomputing goals the NSF wants to support over the next five years, Blatecky said that the real mission is to support a broader group of scientific users, especially those working in hot applications like genomics, materials science and environmental research. Most of their plans revolve around socially oriented missions, including studies to predict earthquakes, flood outcomes and disaster response situations, and medically driven research on the HIV and epidemic modelling fronts. We also talked briefly about how HPC as we know it, and as the NSF funds it, could change over the next five years. "I don't know what it will be," he noted, but he has no doubt that performance-driven architectures might not be enough to keep up with the very real data explosion across real science applications unless they strike the memory/storage/power balance that Blue Waters has. While not all HPC applications are necessarily hugely data-intensive, a look down the list of applications reveals some of the highest data-volume-driven research areas in science, particularly around medical and earth sciences projects. TACC, for instance, will now be the centre of some cutting-edge earthquake, environmental and ecological research as scientists from around the world bring their best and brightest ideas, not to mention an unprecedented level of data, to the common table of the shared resource. As TACC Director Jay Boisseau stated upon the formal announcement of Stampede yesterday, the system has been designed to support a large, diverse research community. "We are as excited about Stampede's comprehensive capabilities and its high usability as we are about its tremendous performance."
On that note, 90% of TACC's new powerhouse will be dedicated to the XSEDE program, a unified virtual system that lets scientists around the world tap into powerful systems, new data wells and computational tools through one hub. TACC will tap into the remaining horsepower for larger goals within its own centre and in the University of Texas research community. And there is certainly some power to the system. As TACC described in its own statement on the specs, the Dell and Intel system boasts the following points of pride: Stampede system components are connected via a fat-tree, FDR InfiniBand interconnect. One hundred and sixty compute racks house compute nodes with dual eight-core sockets and feature the new Intel Xeon Phi coprocessors. Additional racks house login, I/O, big-memory and general hardware-management nodes. Each compute node is provisioned with local storage. A high-speed Lustre file system is backed by 76 I/O servers. Stampede also contains 16 large-memory nodes, each with 1 TB of RAM and 32 cores, and 128 standard compute nodes, each with an NVIDIA Kepler K20 GPU, giving users access to large shared-memory computing and remote visualization capabilities, respectively. Users will interact with the system via multiple dedicated login servers and a suite of high-speed data servers. The cluster resource manager for job submission and scheduling will be SLURM (Simple Linux Utility for Resource Management). Unlike Stampede, which is expected to make a top-5 showing on the Top 500, Blue Waters will not be benchmarking, for reasons NCSA's Bill Kramer explained to us in detail right around SC12.
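Since the article notes that SLURM will handle job submission and scheduling on Stampede, here is a minimal sketch, written in Python for illustration only, of how a batch job might be written out and handed to SLURM's sbatch command. The partition name, node counts and application are assumptions, not Stampede's actual configuration.

```python
# Minimal sketch of submitting a batch job through SLURM, the scheduler named
# in the article. The partition, resource sizes and executable are invented
# and are not Stampede's real configuration; sbatch must be on the system.

import subprocess
from pathlib import Path

job_script = """#!/bin/bash
#SBATCH --job-name=demo-md        # job name shown in the queue
#SBATCH --partition=normal        # hypothetical partition/queue name
#SBATCH --nodes=4                 # number of compute nodes requested
#SBATCH --ntasks-per-node=16      # tasks (e.g. MPI ranks) per node, illustrative
#SBATCH --time=01:00:00           # wall-clock limit (HH:MM:SS)

# Launch a hypothetical parallel application across the allocated nodes.
ibrun ./my_simulation input.dat
"""

script_path = Path("demo_job.slurm")
script_path.write_text(job_script)

# Hand the script to the scheduler; SLURM queues it and returns a job ID.
subprocess.run(["sbatch", str(script_path)], check=True)
```

In everyday use the same #SBATCH directives would simply live in a hand-written script submitted directly with sbatch; the Python wrapper is only a convenience for generating jobs programmatically.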
Of course, not that it needs to convince us that it will be a scientific powerhouse. The Blue Waters saga began back in 2007, when the NSF funded the super to the tune of $208 million. At the time, IBM was at the heart of the project, but it withdrew and refunded its payments for the Blue Waters system after looking at the cost-versus-return equation. Cray was later selected to take over the project with a $188 million contract that would carry the super to completion. Work on the system has since been completed and Blue Waters has been installed at NCSA. The 11.6-petaflop (peak) supercomputer contains 237 XE cabinets, each with 24 blade assemblies, and 32 cabinets of the Cray XK6 supercomputer with NVIDIA Tesla GPU computing capability. Currently available in "friendly-user" mode for NCSA-approved teams, Blue Waters provides sustained performance of 1 petaflop or more on a range of real-world science and engineering applications. "Blue Waters is an example of a high-risk, high-reward research infrastructure project that will enable NSF to achieve its mission of funding basic research at the frontiers of science," said NSF Acting Director Cora Marrett. "Its impact on science and engineering discoveries and innovation, as well as on national priorities, such as health, safety and well-being, will be extraordinary."
Examples

1. Modelling HIV

Blue Waters is enabling Klaus Schulten and his team at UIUC to describe the HIV genome and its behaviour in minute detail, through computations that require the simulation of more than 60 million atoms. They just published a paper in PLOS Pathogens touting an early discovery: not (yet) the structure of the HIV virus, but that of a smaller virus, which could only be achieved through a 10-million-atom molecular dynamics simulation, inconceivable before Blue Waters. The team is using Blue Waters to investigate complex and fundamental molecular dynamics problems requiring atomic-level simulations that are 10 to 100 times larger than those modelled to date, providing unprecedented insights.

3. Earthquake Prediction

A team at the Southern California Earthquake Centre, led by Thomas Jordan, is carrying out large-scale, high-resolution earthquake simulations that incorporate the entire Los Angeles basin, including all natural and human-built infrastructure, requiring orders of magnitude more computing power than studies done to date. Their work will provide better seismic hazard assessments and inform safer building codes: preparing for the Big One.
"Our hope is that the answers we discover will enhance flood assessment, drought monitoring, and the management of water resources in large river basins worldwide.
Nicole Hemsoth
References
[1] HPC Wire|
http://www.hpcwire.com/hpcwire/2013-0328/nsf_official_on_new_supers_dataintensive_future.html
The Clarius Skills Index for the December 2012 quarter also shows there were 211,700 IT positions (including vacancies) available during the period, but only 207,100 professionals available to fill these roles.
Rebecca Merrett
Follow her on Twitter: @Rebecca_Merrett
References
[1] Computer World|
http://www.computerworld.com.au/article/457559/ google_australia_funds_universities_spruik_comput er_science/?fp=16&fpid=1
Crowd-funding is working for Open Source projects