

POWER

Page 4 | Could Google tilt a close election?
Page 7 | Making Robots Mimic the Human Hand
Page 9 | Researchers Find Surprising Similarities Between Genetic and Computer Codes
Page 11 | Mobile App Turning iPhone into a biologically-inspired hearing aid
Page 13 | RIT researchers develop advanced video and image processing
Page 14 | Holograms Add New Dimension to Fighting Fire
Page 16 | New clues to Wikipedia's shared super mind
Page 17 | NSF Official On New Supers, Data-Intensive Future
Page 21 | Google Australia funds universities to spruik computer science
Page 23 | Crowd-funding is working for Open Source projects

Could Google tilt a close election?


Google's motto is "Don't be evil." But what would it mean for democracy if it was? That's the question psychologist Robert Epstein has been asking in a series of experiments testing the impact of a fictitious search engine (he called it Kadoodle) that manipulated search rankings, giving an edge to a favoured political candidate by pushing up flattering links and pushing down unflattering ones. Not only could Kadoodle sway the outcome of close elections, he says, it could do so in a way most voters would never notice.

Epstein, who had a public spat with Google last year, offers no evidence of actual evil acts by the company. Yet his exploration of Kadoodle (think of it as the equivalent of Evil Spock, complete with goatee) not only illuminates how search engines shape individual choices but asks whether the government should have a role in keeping this power in check. "They have a tool far more powerful than an endorsement or a donation to affect the outcome," Epstein said. "You have a tool for shaping government. ... It's a huge effect that's basically undetectable."

There is no reason to believe that Google would manipulate politically sensitive search results. The company depends on its reputation for presenting fair, useful links, and though that image has taken some hits in recent years, with high-profile investigations in the United States and Europe, it would be far worse to get caught trying to distort search results for political ends.

Yet Epstein's core finding, that a dominant search engine could alter perceptions of candidates in close elections, has substantial support. Given the wealth of information available about Internet users, a search engine could even tailor results for certain groups, based on location, age, income level, past searches, Web browsing history or other factors. The voters least tuned in to other sources of information, such as news reports or campaign advertisements, would be most vulnerable. These are the same people who often end up in the crucial middle of American politics as coveted swing voters.

"Elections are won among low-information voters," said Eli Pariser, former president of MoveOn.org and the author of The Filter Bubble: What the Internet Is Hiding From You. "The ability to raise a negative story about a candidate to a voter ... could be quite powerful."

Even efforts to refine search algorithms, he said, can unintentionally affect what voters see on their results pages. A search engine that favours certain news sources based, for example, on the sophistication of the writing as measured by vocabulary or sentence length might push to prominence links preferred by highly educated readers, helping the political party and ideas they support.

Epstein's research is slated to be presented in Washington this spring at the annual meeting of the Association for Psychological Science. The Washington Post shared an advance copy of a five-page research summary with officials at Google. "Providing relevant answers has been the cornerstone of Google's approach to search from the very beginning," the company said in a statement. "It would undermine people's trust in our results and company if we were to change course."

It certainly is clear that outside groups seek to manipulate Google's results. The consequences of such tactics in the consumer world are well known, with companies spending vast sums trying to goose search rankings for their products in make-or-break bids for profit. In the political realm, the creators of "Google bombs" managed to link the name of then-Sen. John Kerry, the Democratic presidential nominee in 2004, with the word "waffles" in search results. President George W. Bush had his name linked, through similar tactics, to the words "miserable failure." In 2010, a conservative group used a collection of linked Twitter accounts to affect search rankings about the Massachusetts special election that brought Scott Brown to the Senate, according to research by two computer science professors at Wellesley College.

Google has resisted such tactics, and its vulnerability to manipulation from outside was limited in the 2012 election cycle, according to researchers, political professionals and search experts. Though search results on Google are generated by a complex and ever-changing algorithm weighing, for example, links to other sites, content quality and the time spent on sites when people click through, the key factors emphasize relevance to users. The company works to spot and defeat those who seek to alter results unfairly, and it sometimes punishes those who do by demoting their search rankings.

But Epstein's argument is based on a different scenario: What if manipulation came from within? Even those who harbour no doubts about Google's intentions generally agree that internal manipulation would be potent and, at least initially, hard to spot. "They could do something manually with these results, but I can't see why they would do that," said Mike Grehan, publisher of Search Engine Watch and a commentator whose views often are in line with Google's.

Yet Epstein and some others say the company's power alone, whether or not it uses it, calls out for legal safeguards. Though Microsoft, Yahoo and Facebook also operate search engines, Google has about two-thirds of the U.S. market. Even if Google has no plan to skew search rankings today, what if conditions or its corporate leadership changed over time? There is a bit of history of some powerful communications companies directly meddling in elections.

"I don't think Google has an incentive to do this, but a future Google could," said Tim Wu, a Columbia University law professor and the author of The Master Switch: The Rise and Fall of Information Empires. "The question of free speech in America is controlled by a few powerful gatekeepers who could subtly shape things."

In the 1800s, Wu noted, Western Union employees often read telegrams from Democrats and shared their contents with Republicans, their political allies, or didn't deliver them. This stopped, Wu said, only with the arrival of forceful federal regulation.

Epstein, a Harvard-trained psychologist and former editor in chief of Psychology Today, turned his attention to Google after the company flagged search results for a Web site that he ran, warning that it was infected with malicious programs that could harm visitors. Epstein complained publicly about the move and the lack of responsiveness from Google, e-mailing senior company officials. He later acknowledged that his site had been infiltrated by hackers, but the experience left him aghast at what he considered Google's unchecked power. He wrote blog posts calling for greater regulatory oversight of the company.

For his experiment, conducted with colleague Ronald E. Robertson at the American Institute for Behavioural Research and Technology, Epstein attempted to shape the perceptions of a random sampling of potential voters in California. The test involved an election most of the subjects knew little about: a close-fought campaign for prime minister of Australia in 2010. The researchers secretly altered the rankings of search results to help favoured candidates. After 15 minutes of searching and reading linked articles, it was clear that the manipulation had worked, with about 65 percent of subjects favouring the candidate getting elevated rankings, compared with 50 percent among a control group that saw impartial search results, according to Epstein. Three out of four subjects, meanwhile, reported no awareness that the search rankings had been altered.

The lack of prior knowledge about the race or alternative sources of information accentuated the effects of the search rankings, Epstein acknowledged. But he said the experiment made clear that manipulation is possible, powerful and hard to detect.

However, the sheer volume of other information available to voters would make such manipulation hard to execute, said David Vladeck, a Georgetown University law professor and the former head of consumer protection at the Federal Trade Commission. Traditional news organizations, he said, probably have more power over the views of voters. "It is not clear to me that, even if Google tried to, it could exercise the same power over the American public as Fox News or MSNBC," Vladeck said. "The claim is such a difficult one to sustain that I find it hard to take it seriously."

Federal regulations have in some circumstances limited what news organizations can do. The Fairness Doctrine once required broadcasters to present both sides of controversial issues, and media cross-ownership rules can still limit the ability of newspapers, for example, to own radio or television stations in the same metropolitan area. Some legal scholars contend that search engine rankings are covered under the First Amendment's free speech protections. Yet even those who think that search engines can have potent effects on elections differ on what kind of regulation, if any, would be sensible and effective. And it's not even clear what federal agency would have the authority to investigate allegations of abuse.

The key lesson may be that search engines are not mere machines spitting out perfectly impartial results. They are driven by decisions made by people who have biases. This does not necessarily make them evil, merely human. "The more trust we give to these kinds of tools, the more likely we can be manipulated down the road," said Panagiotis T. Metaxas, one of the computer science professors at Wellesley College who studied the Massachusetts election. "We need to understand, as people, as citizens, why we believe what we believe."

Craig Timberg
timbergc@washpost.com

References
[1] The Washington Post|
http://www.washingtonpost.com/opinions/couldgoogle-tilt-a-close-election/2013/03/29/c8d7f4e69587-11e2-b6f0-a5150a247b6a_story.html

Making Robots Mimic the Human Hand


As part of a national research project to develop low-cost artificial hands, the Pentagon has released a video of a robot that can change a tire (almost). In the video, the two-armed robot uses a tool to remove a tire from a car. "We're almost at the stage where we can put the nuts back onto the bolts," said Gill Pratt, a program manager at the Pentagon's Defense Advanced Research Projects Agency, or Darpa.

The goal of the program, now in its third phase, is to develop robots and prosthetic devices for wide use. Until now, high cost as well as limits on dexterity and machine vision have been major obstacles to advanced robotic systems. Robotic hands that mimic the capabilities of the human hand have cost $10,000 or more, and computer vision systems have worked only in highly structured environments on a very limited set of objects.

But it is becoming feasible to make hands that will cost less than $3,000 in quantities of 1,000. Two teams, from iRobot, a robot maker in Bedford, Mass., and the government's Sandia National Laboratories in New Mexico, are working on the hand project; they employ a variety of widely available technologies, like cellphone cameras and sensors, to help lower costs.

"We're definitely watching their progress," said Rodney Brooks, founder of Rethink Robotics, a Boston-based maker of low-cost manufacturing robot systems. The Darpa research has been vital in keeping the United States in the forefront of robotic technology, he said. He likened the current work to Darpa projects in the 1980s and 1990s that led to the robotic navigation technologies crucial to the development of self-driving automobiles.

One of the hands under development comes with three fingers and the other comes with four, and they are able to do a variety of delicate operations. In one Darpa video, a robot hand picks up a tweezers and uses it to pick up a straw and move it back and forth, Dr. Pratt said. The various hands are still a work in progress, he noted. The tire-changing video was made "when we were using the old hands and not the new hands, and they did not quite have the dexterity to thread the nut onto the bolt in a way that it doesn't cross the thread."

Darpa also set out tasks that it hopes to accomplish during the next phase. One example is to design a robot arm and hand that can search for an improvised explosive device, or I.E.D., by touch. The challenge would be to program a hand that could open the zipper on a gym bag and then go through the bag and recognize objects by touch.

The agency is also financing research groups in two other categories. It has selected the National Robotics Engineering Center at Carnegie Mellon University, NASA's Jet Propulsion Laboratory and the University of Southern California to continue development of high-level software for the next generation of robot arms. Until recently, the agency asked software developers to develop robotic programs for generic individual motions, like moving forward or backward; now it has set out to simply have the robots perform a specific task. "You could say things like 'pick up the bottle,' 'unlock the door,' tasks like that," Dr. Pratt said.

The agency began with six teams and held a "bake-off" in which it chose three teams to continue in the last phase of the project. In the software project, Darpa supplied each team with a standard hand that it then programmed. "The grasping tasks were done so well that we believe that for the kinds of objects we had them pick up, ranging from a ball to a rock to tools like hammers, we don't need to do further work in grasping," Dr. Pratt said. Manipulating grasped objects was a more challenging task, he said, and one on which the teams would continue to do research. The program is financed for 18 more months.

Darpa is also continuing to finance the development of low-cost arms at Barrett Technologies, a robotics research firm in Cambridge, Mass.; Sandia; iRobot; and SRI International, a research organization in Menlo Park, Calif. The agency is also planning to create a joint project to transfer some of the low-cost technology advances it has made in the project into a related effort to develop prosthetic limbs for wounded soldiers.

Johns Hopkins University has received funds to develop a neural interface, a direct link from a robot arm to the human brain, and DEKA Research, an independent development laboratory headed by Dean Kamen in Manchester, N.H., has developed a separate wearable arm now being considered for approval by the Food and Drug Administration. That robotic arm is close to commercialization, said Geoffrey Ling, acting deputy director of Darpa's Defense Sciences Office. "We have pictures of young men doing rock climbing and one of the patients using chopsticks, which is really extraordinary," he said. "It provides a high degree of functionality, and the patients who have it are using it."

John Markoff
References
[1] The New York Times|
http://www.nytimes.com/2013/03/30/science/making-robots-mimic-the-human-hand.html?_r=1&

Researchers Find Surprising Similarities Between Genetic and Computer Codes


The term "survival of the fittest" refers to natural selection in biological systems, but Darwin's theory may apply more broadly than that. New research from the U.S. Department of Energy's Brookhaven National Laboratory shows that this evolutionary theory also applies to technological systems. Computational biologist Sergei Maslov of Brookhaven National Laboratory worked with graduate student Tin Yau Pang from Stony Brook University to compare the frequency with which components "survive" in two complex systems: bacterial genomes and operating systems on Linux computers. Their work is published in the Proceedings of the National Academy of Sciences. Maslov and Pang set out to determine not only why some specialized genes or computer programs are very common while others are fairly rare, but to see how many components in any system are so important that they can't be eliminated. "If a bacteria genome doesn't have a particular gene, it will be dead on arrival," Maslov said. "How many of those genes are there? The same goes for large software systems. They have multiple components that work together and the systems require just the right components working together to thrive.'" Using data from the massive sequencing of bacterial genomes, now a part of the DOE Systems Biology Knowledgebase (KBase), Maslov and Pang examined the frequency of usage of crucial bits of genetic code in the metabolic processes of 500 bacterial species and found a surprising similarity with the frequency of installation of 200,000 Linux packages on more than 2 million individual computers. Linux is an open source software collaboration that allows designers to modify source code to create programs for public use. The most frequently used components in both the biological and computer systems are those that allow for the most descendants. That is, the more a component is relied upon by others, the more likely it is to be required for full functionality of a system. It may seem logical, but the surprising part of this finding is how universal it is. "It is almost expected that the frequency of usage of any component is correlated with how many other components depend on it," said Maslov. "But we found that we can determine the number of crucial components those without which other components couldn't function by a simple calculation that holds true both in biological systems and computer systems." For both the bacteria and the computing systems, take the square root of the interdependent components and you can find the number of key components that are so important that not a single other piece can get by without them. Maslov's finding applies equally to these complex networks because they are both examples of open access systems with components that are independently installed. "Bacteria are the ultimate BitTorrents of biology," he said, referring to a popular file-sharing protocol.


"They have this enormous common pool of genes that they are freely sharing with each other. Bacterial systems can easily add or remove genes from their genomes through what's called horizontal gene transfer, a kind of file sharing between bacteria," Maslov said. The same goes for Linux operating systems, which allow free installation of components built and shared by a multitude of designers independently of one another. The theory wouldn't hold true for, say, a Windows operating system, which only runs proprietary programs. Maslov is co-principal investigator in the KBase program, which is led by principal investigator Adam Arkin of DOE's Lawrence Berkeley National Laboratory, with additional co-principal investigators Rick Stevens of DOE's Argonne National Laboratory and Robert Cottingham of DOE's Oak Ridge National Laboratory. Supported by DOE's Office of Science, the KBase program provides a high-performance computing environment that enables researchers to access, integrate, analyze and share large-scale genomic data to facilitate scientific collaboration and accelerate the pace of scientific discovery. DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.

Chelsea Whyte, Phone: (631) 344-8671
Peter Genzer, Phone: (631) 344-3174

References
[1] Brookhaven National Lab|
http://www.bnl.gov/newsroom/news.php?a=11518


Mobile App Turns iPhone Into a Biologically-Inspired Hearing Aid


Researchers at the University of Essex have developed a free mobile app that turns an iPhone or iPod into a hearing aid that could revolutionise the future for people with hearing loss. Unlike standard hearing aids that simply amplify all sounds, the BioAid app is inspired by biology and replicates the complexities of the human ear. It puts the user in control, is available to anyone, anywhere, without the need for a hearing test, and potentially holds the key to a future where tiny, phone-based hearing aids can be dispensed and adjusted remotely.

BioAid, which is available on iTunes, has been developed by Professor Ray Meddis of Essex's Department of Psychology with Nick Clark, formerly a Research Officer in the Department, and Dr Wendy Lecluyse of University Campus Suffolk. Unlike standard aids that have a single setting, BioAid has six fixed settings, each of which has four fine-tuning settings, allowing the user to find the perfect match for their impairment.

Professor Meddis said: "We are very excited about the potential of BioAid, which could genuinely change lives. People with hearing impairment very often withdraw from public life. Even if they have a hearing aid, the technology is not sophisticated enough to offer a tailor-made solution to their impairment and in many cases people simply stop using them. Sounds are a complicated mixture of different frequencies and hearing loss is usually a loss of sensitivity to some but not all frequencies. Standard hearing aids amplify some frequencies more than others, but BioAid is different because it also compresses the very loud sounds that can make social situations like going to the pub, cinema or a birthday party intolerable."

Nick Clark added: "The mobile phone is a great platform for rapidly transferring hearing aid technology from the laboratory to the hands of the public. Standard hearing aids, which can cost thousands of pounds, are only dispensed by a professional after a hearing test. BioAid offers a simple alternative accessible to anyone with an iPhone or iPod. The hearing test is replaced by an exploratory process, allowing users to find which setting works best for them. In the short term, people unsure about visiting a hearing care professional might be swayed to do so by BioAid, which can only be a good thing."

As phones get smaller and technology continues to advance, the researchers believe the BioAid project has the potential to radically change the future of hearing devices. Professor Meddis explained: "It's not inconceivable that we'll wear phones on our wrist in the near future, or even as tiny devices behind the ear. With the BioAid algorithm and wi-fi technology, we could see dispensers able to remotely adjust the settings on a phone-based aid and even monitor use to ensure the user is getting the most out of it."


Wendy Lecluyse added: "This new device opens up many intriguing research possibilities, allowing scientists to explore new ideas in hearing aid design and how they work in everyday settings. At the moment, we are particularly interested to find out how the preferred setting of each user corresponds with their hearing problem."

The development of BioAid, which has been funded by the Engineering and Physical Sciences Research Council, is part of a research project to influence the future of hearing aids. The researchers want to hear about people's experiences using BioAid so that they can continue to perfect the technology. Users can get in touch, and find further information, at: http://bioaid.org.uk/.
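To make the two ideas in the description above concrete, frequency-selective amplification plus taming of very loud sounds, here is a deliberately simplified toy sketch. It is not the BioAid algorithm (which models the ear far more faithfully); the band edges, gains and limiting threshold are arbitrary placeholders:

```python
import numpy as np

def toy_aid(signal, sample_rate, band_gains_db, limit_db=-6.0):
    """Toy illustration only: amplify some frequency bands more than others,
    then cap very loud peaks so the amplified sound stays tolerable. A crude
    stand-in for the frequency-selective gain and loudness compression the
    article describes, not the biologically inspired BioAid processing."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    # Split the spectrum into equal-width bands, one gain value per band.
    edges = np.linspace(0.0, sample_rate / 2.0, len(band_gains_db) + 1)
    for gain_db, lo, hi in zip(band_gains_db, edges[:-1], edges[1:]):
        band = (freqs >= lo) & (freqs < hi)
        spectrum[band] *= 10.0 ** (gain_db / 20.0)
    boosted = np.fft.irfft(spectrum, n=len(signal))
    # Hard-limit peaks above the threshold (a very rough form of compression).
    limit = 10.0 ** (limit_db / 20.0)
    return np.clip(boosted, -limit, limit)

# Example: boost high frequencies more than low ones on a synthetic tone mix.
sr = 16_000
t = np.arange(sr) / sr
mix = 0.5 * np.sin(2 * np.pi * 300 * t) + 0.1 * np.sin(2 * np.pi * 3000 * t)
out = toy_aid(mix, sr, band_gains_db=[0, 6, 12, 12])
```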

University of Essex
Telephone: 01206 873529 E-mail: comms@essex.ac.uk.

References
[1] University of Essex|
http://www.essex.ac.uk/news/event.aspx?e_id=5095


RIT Researchers develop advanced video & image processing


Rapid developments in satellite and sensor technologies have increased the availability of high-resolution, remotely sensed images faster than researchers can process and analyze the data manually. Researchers at Rochester Institute of Technology are developing advanced intelligence processing technologies to handle those large volumes of data in a timely manner, and to effectively distinguish objects, scale, complexity and organization.

Eli Saber, professor of electrical and microelectronic engineering in RIT's Kate Gleason College of Engineering, and David Messinger, associate research professor of imaging science in the university's Chester F. Carlson Center for Imaging Science, were awarded two grants, totaling more than $1.1 million, from the Department of Defense to promote a platform for intelligent computer processing. The first, "Hierarchical Representation of Remote Senses Multimodal Imagery," was awarded $576,042 to advance the foundation for object-based image analysis of remotely sensed images, and to explore the use of topological features to improve classification and detection results. The second grant, "Spatio-Temporal Segmentation of Full Motion Airborne Video Imagery," was awarded $576,043 and focuses on development of a segmentation methodology to differentiate the unique cues of moving and still objects derived from full motion video capture.

"It all comes down to efficiently handling large amounts of image data collected from satellites and video streams, which are not necessarily big images, but I can collect video for hours," says Messinger, who also serves as the director of RIT's Digital Imaging and Remote Sensing Laboratory. "You'd like to be able to download the data, have it go into a computer system and have it reduce that eight hours of video down to 20 minutes that somebody actually has to look at, just the highlights so they can process the information to make decisions."

Both projects advance work done by the researchers in the area of image segmentation, with this newest research focused on advanced video processing. Messinger and Saber's project team will develop complex computer algorithms to continue advancing this technology. Computers interpret object information from images and video as a two-dimensional plane, unlike humans who understand an object's three-dimensional aspects, says Saber.

"We struggle in doing the proper video segmentation intelligently," he adds. "How do computers form this recognition that we as humans have understood for most of our lives? How do you get the computer to recognize images the same as humans would do it? It is a problem that is largely unsolved and difficult."

The system the team is producing would be adaptable for identifying structures, objects of various sizes, shapes and timescales, says Messinger. "It has to be flexible enough to capture all of that information in multiple spatial and temporal scales," he says. "I want to be able to process it to extract information automatically, so I can make the process more efficient for the end user."

http://www.rit.edu/news/story.php?id=49877


Holograms Add New Dimension to Fighting Fire


The use of thermal imaging in fighting fires is 25 years old this year; the first documented life saved by the technology goes back to a New York City fire in 1988. Though it took years for thermal imaging technology to become widespread due to cost, once it was well established in fire fighting, a direct connection between its use and the preservation of life was clear. And now, a new device being developed by researchers could further augment this life-saving technology.

In Italy, Pietro Ferraro of the Consiglio Nazionale delle Ricerche (CNR) Istituto Nazionale di Ottica (National Research Council - National Institute of Optics) is using hologram technology to create three-dimensional images that would allow fire fighters to see through smoke and flames during a rescue. Though thermal imaging can see through smoke, the presence of flames can obscure objects, such as people in need of rescue. Instead of using lenses to generate an image, Ferraro's hologram device uses laser beams and something called numerical processing, so the device can see through flames and generate a 3-D image of a room. If somehow combined with thermal imaging, the technology could provide yet another layer of information to fire fighters.

"So far, the experiments have been carried out in a laboratory, but simulating outdoor conditions," Ferraro said via email. "No anti-vibration systems have been used and no dark-rooms have been employed. For these reasons, we are strongly confident about the possibility to bring this technology out of the lab. We think that in a few years, these systems could be applied for fixed installations, for example in hospitals, schools, tunnels or even highways."

The software behind Ferraro's experiments works quickly, he said, and a single frame of imagery can be constructed in less than half a second. The invention can scan for data and process the data in quasi-real time, he said, generating a rapidly updated 3-D image of a room or area. Because the software demands a relatively small amount of processing power from a computer, the processing could be performed by a common laptop or mobile device. "We strongly think that this part can be performed at a fire scene," he said, "maybe by a host connected from a mobile station outside the building."

Ferraro's invention isn't available yet, but Capt. Jon Muir, public information officer of the Orange County Fire Authority in California, said it sounds potentially useful. "Any technology that will assist or aid us in doing what we need to do," Muir said, "is something worth looking into." For 15 years, Muir said, he's been using thermal imaging, along with others, to make fighting fires safer.

Thermal imaging has three main uses, Muir said. It can allow fire fighters to measure the temperature of a burning building and identify what stage the fire is in. Thermal imaging can also help fire fighters understand the layout of a building and spot weak structural elements before they fall. Perhaps most importantly, thermal imaging can be used to find victims amid the flames. In this way, thermal imaging has saved lives.


But sometimes, Muir said, flames can make it difficult to see everything, so if holograms could be combined with thermal imaging to create a more complete picture, it would be a welcome addition.

Colin Wood
References
[1] Government Technology|
http://www.govtech.com/public-safety/HologramsAdd-New-Dimension-to-Fighting-Fire.html


New clues to Wikipedia's shared super mind


Wikipedia's remarkable accuracy and usefulness come from something larger than the sum of its written contributions, a new study by SFI Research Fellow Simon DeDeo finds. The free, anonymously written and edited online encyclopaedia was widely expected to fall prey to cranks and partisans. Instead, it has proven no less accurate than the venerable Encyclopaedia Britannica, according to several analyses of the quality of its information. The question is: how?

A great example of this cooperative nature is Wikipedia's article on former U.S. President George W. Bush, a highly contested piece of Wiki real estate that has been edited some 45,000 times. "Show me a place on the Internet where people agree about George W. Bush?" asks DeDeo. But the Wikipedia article reads as if it was written by aliens who didn't care [about Bush], although we know it was written by people who cared a lot.

Just how Wikipedia manages this collective balance is something DeDeo was able to study in detail because, unlike most other social systems, every Wikipedia edit is recorded. "It's almost like you had closed circuit cameras running as a society is creating itself," he says, so every move could be studied and watched.

All these sequences of behaviors create what can be viewed as a historical grammar, like that of a language or even bird song. A bird song, for example, has very simple grammar, with few elements and combinations possible, what's called a finite-state system. The historical language that creates and maintains Wikipedia might be expected to follow a rather limited grammar as well, but that's not what DeDeo discovered. "The big result is that the Wikipedia behavior is what we call non-finite state," DeDeo says. "It's constantly generating new patterns of behavior that haven't been seen before."
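For readers unfamiliar with the term, a finite-state system is one whose possible behavior can be captured by a fixed set of states and transitions. The toy sketch below is purely illustrative: the states and syllables are invented, and this is not DeDeo's model of Wikipedia edits. It generates bird-song-like sequences from such a fixed grammar, which is exactly the kind of box Wikipedia's editing behavior turns out not to fit into.

```python
import random

# A toy finite-state grammar: every possible "song" comes from walking a
# fixed set of states and transitions. The states and syllables here are
# invented for illustration; they are not taken from the study.
TRANSITIONS = {
    "start":  [("chirp", "intro")],
    "intro":  [("trill", "middle"), ("chirp", "intro")],
    "middle": [("whistle", "middle"), ("trill", "end")],
    "end":    [],  # terminal state: the song is over
}

def sing(max_syllables: int = 12) -> list[str]:
    """Generate one song by following the finite-state transitions."""
    state, song = "start", []
    while TRANSITIONS[state] and len(song) < max_syllables:
        syllable, state = random.choice(TRANSITIONS[state])
        song.append(syllable)
    return song

print(" ".join(sing()))  # e.g. "chirp chirp trill whistle trill"
```

No matter how long such a machine runs, its output stays within the patterns the fixed table already allows; DeDeo's finding is that Wikipedia's edit history keeps producing patterns that no fixed table of this kind can capture.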

One possibility, he says, is that the unbounded source for these behavior patterns in Wikipedia is shared between people: it is the product of everyone's mind. "That's what's really exciting," he says.

References
[1] SANTA FE INSTITUTE|
http://www.santafe.edu/news/item/dedeowikipedia-shared-super-mind/


NSF Official On New Supers, Data-Intensive Future


It has been a noteworthy week in the world of scientific and technical computing as two long-awaited supercomputers have been formally revved up for big research action. The Dell-Intel scientific workhorse, Stampede, at TACC was ushered into the large-scale distributed research fold yesterday. And at the moment of this writing, the rather storied IBM- and then Cray-backed Blue Waters system at NCSA is gearing up for its formal intro.

At the heart of both of these systems is some serious monetary backing from the National Science Foundation (NSF), which has committed several million to seeing both supers into the world, no matter how entangled the path. The organization funded the large majority of both projects in the name of furthering some critical human-centred scientific projects related to the environment, genomics, disaster preparedness and epidemiology.

We chatted earlier this week with Alan Blatecky, who directs the NSF's Division of Advanced Cyberinfrastructure, about where these supers fit into the overarching mission of the NSF--and what the future looks like as applications require systems that are as "big data" ready as they are computationally robust.

Blatecky reiterated that from an NSF standpoint, these are two major investments in HPC, but they aren't necessarily related in terms of anticipated use or application types. As he told us, the two systems are designed for quite different purposes. On the one hand, the massive Stampede will cater to a large number of users with an emphasis on boosting the breadth of applications, not to mention extending what those extended apps are able to crunch. Blue Waters, on the other hand, will focus on a much smaller number of users, perhaps as many as a dozen, who have very deep, specific research applications.

While grappling with multiple users across a distributed system like Stampede and its XSEDE base is never simple, there are far more pressing challenges. In addition to pointing to extensive application retooling that needs to happen, especially on Blue Waters, there was one phrase we heard several times--"big data".

The ability to take advantage of the large number of cores on a machine like Blue Waters is one of the biggest challenges users will face, says Blatecky, who points to how his organization is providing support on the programming and computer science front to aid domain specialist scientists. He said that going forward, the systems that will shine for the big science endeavours of the NSF will be those that can strike a balance between being data-intensive systems while retaining the computational power of massive numbers of cores, some of which are being pushed by accelerators and co-processors.

As Blatecky detailed, "Our point of view at the NSF is focused on the broader base of scientific users. We're interested in the data-intensive computational requirements, which is part of what's unique about Blue Waters. It has that needed balance between power, memory and storage to address both the data-intensive and computationally-intensive applications."


When asked about the supercomputing goals the NSF wants to support over the next five years, Blatecky said that the real mission is to support a broader group of scientific users, especially those working in hot applications like genomics, materials science and environmental research areas. Most of their plans revolve around socially-oriented missions, including studies to predict earthquakes, flood outcomes, disaster response situations, and medically-driven research on the HIV and epidemic modelling fronts.

We also talked briefly about how HPC as we know it--and the NSF funds it--could change over the next five years. "I don't know what it will be," he noted, but he has no doubt that the performance-driven architectures might not be enough to keep up with the very real data explosion across real science applications unless they strike that memory/storage/power balance that Blue Waters has. While not all HPC applications are necessarily hugely data-intensive, a look down the list of applications reveals some of the highest data volume-driven research areas in science, particularly around medical and earth sciences projects.

TACC, for instance, will now be the centre of some cutting-edge earthquake, environmental and ecological research as scientists from around the world bring their best and brightest ideas, not to mention an unprecedented level of data, to the common table of the shared resource. As TACC Director Jay Boisseau stated upon the formal announcement of Stampede yesterday, the system has been designed to support a large, diverse research community. "We are as excited about Stampede's comprehensive capabilities and its high usability as we are of its tremendous performance."

On that note, 90% of TACC's new powerhouse will be dedicated to the XSEDE program, which is a unified virtualized system that lets global scientists tap into powerful systems, new data wells and computational tools through one hub. TACC will tap into the remaining horsepower for larger goals within its own centre and in the University of Texas research community.

And there is certainly some power to the system. As TACC described cleanly in their own statement on the specs, the Dell and Intel system boasts the following points of pride: Stampede system components are connected via a fat-tree, FDR InfiniBand interconnect. One hundred and sixty compute racks house compute nodes with dual, eight-core sockets, and feature the new Intel Xeon Phi coprocessors. Additional racks house login, I/O, big-memory, and general hardware management nodes. Each compute node is provisioned with local storage. A high-speed Lustre file system is backed by 76 I/O servers. Stampede also contains 16 large memory nodes, each with 1 TB of RAM and 32 cores, and 128 standard compute nodes, each with an NVIDIA Kepler K20 GPU, giving users access to large shared-memory computing and remote visualization capabilities, respectively. Users will interact with the system via multiple dedicated login servers, and a suite of high-speed data servers. The cluster resource manager for job submission and scheduling will be SLURM (Simple Linux Utility for Resource Management).

Unlike Stampede, which is expected to make a top 5 showing on the Top 500, Blue Waters will not be benchmarking, for reasons NCSA's Bill Kramer explained to us in detail right around SC12.
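A quick aside on the scheduler named in the spec list above: SLURM accepts plain batch scripts through its sbatch command. The sketch below is a hypothetical illustration of that workflow, not Stampede's actual configuration; the partition name, node counts, time limit and executable are placeholders.

```python
import subprocess
import textwrap

# Hypothetical SLURM batch script: the #SBATCH directives are standard SLURM
# options, but the partition, resource counts and executable are placeholders.
BATCH_SCRIPT = textwrap.dedent("""\
    #!/bin/bash
    #SBATCH --job-name=demo_run
    #SBATCH --nodes=2
    #SBATCH --ntasks-per-node=16
    #SBATCH --time=00:30:00
    #SBATCH --partition=normal

    srun ./my_simulation        # launch the MPI executable on the allocation
    """)

def submit(script_text: str, path: str = "job.slurm") -> str:
    """Write the batch script to disk and hand it to SLURM's sbatch command."""
    with open(path, "w") as f:
        f.write(script_text)
    result = subprocess.run(["sbatch", path], capture_output=True, text=True, check=True)
    return result.stdout.strip()   # SLURM replies with e.g. "Submitted batch job <id>"

if __name__ == "__main__":
    print(submit(BATCH_SCRIPT))
```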


Of course, not that it needs to convince us that it will be a scientific powerhouse. The Blue Waters saga began back in 2007 when the NSF funded the super to the tune of $208 million. At the time, IBM was at the heart of the project, but it refunded its payments for the Blue Waters system after looking at the cost versus return equation. Cray was later selected to take over the project with a $188 million contract that would lead the super into completion.

In the year since the video below was filmed, work on the system was completed and Blue Waters was installed at NCSA. The 11.6 petaflops (peak) supercomputer contains 237 XE cabinets, each with 24 blade assemblies, and 32 cabinets of the Cray XK6 supercomputer with NVIDIA Tesla GPU computing capability. Currently available in "friendly-user" mode for NCSA-approved teams, Blue Waters provides sustained performance of 1 petaflop or more on a range of real-world science and engineering applications.

"Blue Waters is an example of a high-risk, high-reward research infrastructure project that will enable NSF to achieve its mission of funding basic research at the frontiers of science," said NSF Acting Director Cora Marrett. "Its impact on science and engineering discoveries and innovation, as well as on national priorities, such as health, safety and well-being, will be extraordinary."

Examples

1. Modelling HIV

Blue Waters is enabling Klaus Schulten and his team at UIUC to describe the HIV genome and its behaviour in minute detail, through computations that require the simulation of more than 60 million atoms. They just published a paper in PLOS Pathogens touting an early discovery: not (yet) the structure of the HIV virus, but that of a smaller virus, which could only be achieved through a 10 million atom molecular dynamics simulation, inconceivable before Blue Waters. The team is using Blue Waters to investigate complex and fundamental molecular dynamics problems requiring atomic level simulations that are 10 to 100 times larger than those modelled to date, providing unprecedented insights.

2. Global Climate Change

Also featured at the dedication event, Cristiana Stan and James Kinter of George Mason University are using Blue Waters to engage in topical research on the role of clouds in modelling the global climate system during present conditions and in future climate change scenarios.

3. Earthquake Prediction

A team at the Southern California Earthquake Centre, led by Thomas Jordan, is carrying out large-scale, high-resolution earthquake simulations that incorporate the entire Los Angeles basin, including all natural and human-built infrastructure, requiring orders of magnitude more computing power than studies done to date. Their work will provide better seismic hazard assessments and inform safer building codes: preparing for the Big One.

4. Flood Assessment, Drought Monitoring & Resource Management


Engineering Professor Patrick Reed and his team from Penn State, Princeton and the Aerospace Corporation, are using Blue Waters to transform understanding and optimization of space-based Earth science satellite constellation designs. "Blue Waters has fundamentally changed the scale and scope of the questions we can explore," he said.


"Our hope is that the answers we discover will enhance flood assessment, drought monitoring, and the management of water resources in large river basins worldwide.

6. Fundamental Properties of Nature


Robert Sugar, professor of physics at the University of California, Santa Barbara is using Blue Waters to more fully understand the fundamental laws of nature and to glean knowledge of the early development of the universe. "Blue Waters packs a one-two punch," said Sugar, "Blue Waters enables us to perform the most detailed and realistic simulations of sub-atomic particles and their interactions to date. Studies such as these are a global endeavour, and the large data sets produced on Blue Waters will be shared with researchers worldwide for further discoveries."

Nicole Hemsoth
References
[1] HPC Wire|
http://www.hpcwire.com/hpcwire/2013-0328/nsf_official_on_new_supers_dataintensive_future.html


Google Australia funds universities to spruik computer science


Google Australia is providing funding to 12 Australian universities this year to develop workshops that help high school teachers promote computer science in their curriculums. Under its Computer Science for High School (CS4HS) program, launched in Australia in 2011, Google provides funds to universities across several countries to develop two- to three-day computer science workshops for the teachers.

Funding varies based on the number of participants and other associated costs and is capped at $15,000 for each program, Google said. The search giant provided funding to seven Australian universities in 2012. The universities to receive funding this year are:

The University of Western Australia
University of Sydney
The University of Queensland
Macquarie University
Swinburne University of Technology
Deakin University
The University of Newcastle
University of Canberra
The University of Adelaide
University of Tasmania
Griffith University
University of New South Wales.

Google Australia and New Zealand's engineering program manager, Sally-Ann Williams, hopes that the increase in the number of universities being funded will ensure computer science education at high schools is up-to-date with the needs of the industry and will grow university ICT enrolments. "We need to ensure that we're equipping our students to be future creators, rather than just consumers, of technology," Williams told CIO Australia. "Right now, we're not well-placed as a country to meet demand for the computer science graduates that are needed in the new digital economy. We hope that by supporting computer science at high school level, we'll increase the number of bright young Australians that go into computer science at university level."

Australian Computer Society's head of policy and external affairs, Adam Redman, has also said that computer science education in high schools needs to be updated, and he would like to see more support for teachers in delivering this to students. "In some high schools students are assessed on how well they can use the computer, not on how well they understand the computer. Kids today are born technology literate. They don't need to be taught how to use the computer; they need to be taught what makes the computer work," Redman said. "So translated into policy, that means a greater emphasis and support to teachers, and to encourage high school students to learn maths and sciences so they can learn the fundamentals of computational maths, for example, and relationship mapping, and by the time they get to university they are not confronted with having to figure out what an algorithm is."

According to the ACS Statistical Compendium 2012, the number of students completing an ICT-related degree has halved over a decade, and women only make up 19.73 per cent of the total ICT-related occupation workforce.



The Clarius Skills Index during the December 2012 quarter also shows there were 211,700 IT positions (including vacancies) available during the period but only 207,100 professionals available to fill these roles.

Rebecca Merrett
Follow her on Twitter: @Rebecca_Merrett

References
[1] Computer World|
http://www.computerworld.com.au/article/457559/google_australia_funds_universities_spruik_computer_science/?fp=16&fpid=1


Crowd-funding is Working for Open Source Projects


Funding Open Source software isn't easy (heck, in some cases, it can be darn near impossible), but it is certainly a worthwhile goal. And, right now, we're seeing a number of people and organizations tackling this challenge.

As you may remember, last month the company behind the LiveCode development tool set up a Kickstarter campaign with the goal of creating an Open Source edition of their software. They set out to raise roughly $500,000 in order to fund the endeavour. And they nailed it.

In the final days of the Kickstarter campaign, it seemed like success may not be on the table this time around. While they had raised a significant amount, they were still considerably short of their goal. Luckily, a surge of folks, I'm assuming some of you reading this are counted among them, pitched in at the last minute and took it over the finish line.

But that description really doesn't do this success justice. They ended up raising nearly $750,000. That's 50% more than the base goal, all directed towards taking an existing (successful) Closed Source software and bringing it under an Open license, proving that, at least in specific circumstances, it is possible to fund the migration of a Closed Source product to an Open one, all through the power of the community. Simply glorious.

In other news, the developers behind the popular image organization tool Shotwell have started a crowd-funding campaign (in this case, through Indiegogo) to fund the further development of Geary, their email client. The approach is different (Geary is an existing, Open Source application looking for continued development, as they do not have the funding to continue it otherwise), but the goal is the same: fund the development of new, open-licensed code. And the scope of work and target dollar amount is different too, in this case only $100,000. But the project looks to be, at first glance, every bit as valuable.

It will be interesting to see if the community at large will continue to fund some of these excellent projects. Here's hoping.

References
[1] Network World|
http://www.networkworld.com/community/blog/crowd-funding-working-open-source-projects

