January 2016 — February 2016

NASA's Computer Model of the Sun's Magnetic Fields Could Prevent Disaster (Feb 9, 2016)
The sun's magnetic fields are complex and difficult to visualize; to make them easier to understand, NASA has built an animated computer model that maps the star's fields. The model shows closed magnetic field lines as well as open ones that project out into space. The open lines are colored pink or green, while bright spots appear where closed lines run into each other. Studying the solar magnetic fields is important because these magnetic explosions can cause solar ...



Biomedical Simulation at PSC Gets Major Performance Boost with Anton 2 (Feb 9, 2016)
Simulating even small biological systems has long proven computationally difficult; practically speaking, data-driven bioinformatics such as DNA sequence analysis has progressed more rapidly. The development in 2008 of Anton 1, an ASIC-based supercomputer designed by D. E. Shaw Research specifically for simulating molecular dynamics, was a major advance. In 2010 DESRES provided an Anton machine at no charge to the Pittsburgh Supercomputing Center, which in turn provided access to a wider biomedical re...



Hack-Proof RFID Chips (Feb 8, 2016)
Researchers at MIT and Texas Instruments have developed a new type of radio frequency identification (RFID) chip that is virtually impossible to hack. If such chips were widely adopted, it could mean that an identity thief couldn't steal your credit card number or key card information by sitting next to you at a café, and high-tech burglars couldn't swipe expensive goods from a warehouse and replace them with dummy tags. Texas Instruments has built several prototypes of the new chip, to the res...



Salinas Hopes to Turn Farm Workers' Children into Computer Scientists (Feb 8, 2016)
With one foot in its fields and another edged toward Silicon Valley, Salinas is trying to reboot itself as the agricultural technology center of California. It hopes to turn the sons and daughters of farmworkers into coders for the next generation of data-driven, automated farming in a valley known as the salad bowl of the world. "We're not trying to reinvent ourselves," said Andrew Myrick, the city's economic development manager. "There's cities all across the country that are trying to attract ...



Contest Introduces Teens to Booming Field of Cybersecurity (Feb 7, 2016)
The room looked like something you'd see in Palo Alto or Mountain View: pizza boxes strewn across a table at one end, young people clustered around computer screens at the other, working in near silence except for the occasional mumble or electronic bleep. They were York High School students searching for viruses, malware, backdoors, password crackers and other Internet terrors on simulated computer networks. After scanning the contents of one folder, Amelia O'Halloran spotted an ominous file. "...



Realistic Simulator Replicates Combat Injuries (Feb 7, 2016)
A multi-disciplinary team of researchers at the University of California, Los Angeles has developed a detailed computer model of an injured human leg that includes the complex workings of blood flow. The aim of the project, sponsored by the Office of Naval Research, is to provide combat medics with a realistic training tool to enable them to better control hemorrhaging. Medics need to know how to treat traumatic limb injuries and severe bleeding, but existing simulators lacked realistic blood me...



Closing the Tech Industry Gender Gap (Feb 6, 2016)
I just returned from this year's annual gathering of the World Economic Forum at Davos, where leaders from around the world gathered to discuss the implications of a new industrial revolution. This fourth industrial revolution (after the revolutions brought about by steam power, electricity and electronics) is using digital technology to revolutionize almost every part of our life at an unprecedented pace, from self-driving cars to AI-enabled assistants. One of the biggest implications, outlined...



Molecular Biology Meets Computer Science Tools in New System for CRISPR (Feb 6, 2016)
A team of researchers from Microsoft and the Broad Institute of MIT and Harvard has developed a new system that allows researchers to more quickly and effectively use the powerful gene editing tool CRISPR. The system, dubbed Azimuth, uses machine learning, in which a computer takes a limited set of training data and uses that to learn how to make predictions about data it hasn't yet seen. In this case, the machine learning system is being used to predict which part of a gene to target when a sci...
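The teaser describes plain supervised learning: fit a model on a small set of scored examples, then predict scores for candidates the model has not seen. A minimal, purely illustrative sketch of that idea follows; the GC-content feature, the toy data, and the nearest-neighbour rule are all invented here for illustration and are not Azimuth's actual method or data.

```python
def gc_fraction(seq):
    """Toy feature: fraction of G/C bases in a guide sequence."""
    return sum(base in "GC" for base in seq) / len(seq)

# Tiny invented training set: (guide sequence, measured activity score).
training = [("GGCCGGCC", 0.9), ("ATATATAT", 0.2), ("GCATGCAT", 0.55)]

def predict(seq):
    """Predict an activity score for an unseen guide by returning the
    score of the training example closest in the single GC feature
    (1-nearest-neighbour)."""
    x = gc_fraction(seq)
    _, score = min(training, key=lambda t: abs(gc_fraction(t[0]) - x))
    return score

print(predict("GGGCCCGT"))  # 0.9 (closest in GC content to GGCCGGCC)
```

A real system would use many sequence-derived features and a far larger training set, but the train-then-predict loop is the same shape.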



Tackling Inequality in Computer Science (Feb 5, 2016)
When he went to university in South Africa, Riaz Moola came face to face with the huge differences in educational opportunity in his country, particularly in his own subject, Computer Science. Instead of just getting on with his course, Riaz - now a Gates Cambridge Scholar at the University of Cambridge - devised a way of tackling the problem. Inspired by recent MOOC platforms such as Coursera, he created an online course platform adapted to Africa which paired tutors - typically Computer Scienc...



Will Machines Eliminate Us? (Feb 5, 2016)
Yoshua Bengio leads one of the world’s preeminent research groups developing a powerful AI technique known as deep learning. The startling capabilities that deep learning has given computers in recent years, from human-level voice recognition and image classification to basic conversational skills, have prompted warnings about the progress AI is making toward matching, or perhaps surpassing, human intelligence. Prominent figures such as Stephen Hawking and Elon Musk have even cautioned that ar...



Tech Startups Working Hard to Sell Culture that Job Hunters Will Buy Into (Feb 4, 2016)
Scott Porad badly wanted to hire Mike Hansen to work at Rover, a Seattle dog-sitting startup. Porad, Rover’s chief technology officer, knew just how to get Hansen, a software developer, to exit the interview process at Google, where he was being considered for a job that would probably pay much more. He would tell Hansen about Rover’s thoroughly dog-friendly benefits. “Mike is the owner of two dogs that he loves,” Porad said. “He felt like the purpose of what we were doing was more mea...



The Strange Rituals of Silicon Valley Intern Recruiting (Feb 4, 2016)
Throughout the academic year, the Wozniak Lounge hosts the tech companies that come to Berkeley hoping to recruit computer-science and engineering students for internships and post-graduate programs. The attendee count at these events typically numbers well into the hundreds—computer science is now the most popular major at Berkeley, which has an undergraduate population of around 27,000 students. An introductory computer-science course, CS61A, had 1,277 students enrolled last semester, all of them en...



Can Tech Generation Stay the Course? Computer Says... No! (Feb 3, 2016)
Drop-out rates for some IT courses are as high as 70%, according to a report from the Higher Education Authority published this week. And in individual maths-related courses across third-level education, up to 80% of students are failing to progress beyond year one. There are now serious concerns about high and persistent skill shortages across the information and computer technology sector. And while there are various theories as to why so many students are failing, it is clear (as the report f...



Designing for a Science Fiction Future (Feb 3, 2016)
When Julian Bleecker first set foot in a lab at the University of Washington in Seattle that researches human-computer interaction, he was at a loss. He’d studied electrical engineering as an undergrad, so the lab’s work on an early version of virtual reality was unfamiliar ground. To get the lay of the land, Bleecker was told to read “Neuromancer,” the 1984 science fiction novel by William Gibson in which people connect computers to their brains and experience the data of cyberspace as if it h...



New Club Focuses on Gender Gap in Technology (Feb 2, 2016)
Members of Prospect High School's new Girls Who Code club learned from the founder of the national non-profit organization that it's important to fail. "When I was growing up in Schaumburg, I was terrified of math and science … and my parents were both engineers," Reshma Saujani, CEO of the group that aims to close the gender gap in technology by inspiring girls to pursue computer science-related careers, said last week. "In life, you should be failing as much as you can, and trying to be impe...



Invasion of the Data Scientists: Hot Job of 2016 Expands Beyond Tech (Feb 2, 2016)
Data scientist, named the best job in America for 2016 by job site Glassdoor, is the sexy mashup of traditional careers from data analysis, economics, statistics, computer science and others. But it goes beyond collecting and analyzing data. It's a job for the curious, for the intuitive and for those who like to not just solve problems but figure out the problem. It's part science, part art. The rise of data science is due to the explosive growth of data collection — or big data — and the ne...



Obama Pledges $4 Billion to Computer Science in US Schools (Feb 1, 2016)
President Obama pledged $4 billion in funding for computer science education in the nation’s schools. The Computer Science for All Initiative slated for the president’s forthcoming budget plan would include an additional $100 million that would go directly to school districts to fund computer science programs. Under the president’s plan, the Department of Education will distribute the $4 billion over three years among states that propose well-designed five-year plans to increase computer science ...



Go-playing Google Deepmind AlphaGo Computer Defeats Human Champion (Feb 1, 2016)
You can chalk it up as another victory for the machines. In what they called a milestone achievement for artificial intelligence, scientists said on Wednesday they have created a computer program that beat a professional human player at the complex board game called Go, which originated in ancient China. The feat recalled IBM supercomputer Deep Blue's 1997 match victory over chess world champion Garry Kasparov. But Go, a strategy board game most popular in places like China, South Korea and Japa...



Engineering Must Focus on Making Science Work for People (Jan 31, 2016)
I attended a conference in the 1980s with 600 people in the plenary session. The speaker asked who would recommend engineering to their children. Only six hands went up: 1 per cent. That this was an engineering conference made the result even more alarming. It may go some way to explain why today engineers make up just 9 per cent of the workforce, of which 2 per cent are female. The need to promote STEM subjects in school is well documented, but there is something more fundamental that needs to...



Long Live the King (Jan 31, 2016)
Upgrading legacy HPC systems relies as much on the requirements of the user base as it does on the budget of the institution buying the system. There is a gamut of technology and deployment methods to choose from, and the picture is further complicated by infrastructure such as cooling equipment, storage, networking – all of which must fit into the available space. However, in most cases it is the requirements of the codes and applications being run on the system that ultimately define choice ...



How Are All Those Screens Changing Kids' Behavior? (Jan 30, 2016)
In a recent op-ed in the New York Times, businessman and author Tony Schwartz offered an honest yet troubling account of what he calls his "addiction" to technology. The medical and research fields have not yet come to a clear consensus on what constitutes technology "addiction" and which factors distinguish a true technology addiction disorder from problematic use or just bad habits. But the behavior Schwartz describes is remarkably familiar to those of us who are on our devices more than we kn...



How Blockchain Tech Could Change the Way We Do Business (Jan 30, 2016)
Blockchain - the technology underpinning digital currency Bitcoin - has been in the news lately. Banks think it could be the future of financial transactions, while diamond miners hope it will help end the trade in conflict diamonds. The UK's chief scientific adviser encouraged the British government to adopt the technology. But what exactly is it and why is it causing such a stir? Technology of Business (tries) to explain. Blockchain is a method of recording data - a digital ledger of transacti...
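The "digital ledger" idea can be sketched in a few lines: each block commits to the hash of the previous block, so altering any historical entry invalidates every later link. A minimal illustration of that chaining (not Bitcoin's actual data structures, and with no mining or network layer):

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 hash of a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    """Append a block that commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "prev_hash": prev, "data": data})
    return chain

def verify(chain):
    """The ledger is valid only if every link matches its predecessor's hash."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain = []
add_block(chain, "Alice pays Bob 5")
add_block(chain, "Bob pays Carol 2")
print(verify(chain))                     # True
chain[0]["data"] = "Alice pays Bob 500"  # tamper with history
print(verify(chain))                     # False
```

This tamper-evidence is the property banks and diamond tracers are after: rewriting old records silently is impossible without recomputing, and getting consensus on, every subsequent block.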



New Reconstruction Method Improves Facial Recognition for Forensic Purposes (Jan 29, 2016)
Researchers in the Services, Cybersecurity and Safety department of the University of Twente have developed an improved reconstruction method for facial recognition based on camera images. This method yields a better score in ninety percent of the examined cases, and helps forensic investigators with their daily work. The researchers recently published their results in the academic journal IET Biometrics. Facial recognition in a forensic context is a complex discipline. The reconstruction of face...



The Death and Life of Traditional HPC (Jan 29, 2016)
When three distinguished Intel Fellows—Bill Magro, Mark Seager and Al Gara—sat down together to discuss HPC’s Next Phase, the conversation was quite lively because all three are working on cutting edge aspects of the rapidly changing and evolving technology portfolio for the high performance computing ecosystem. Moderating the panel discussion at SC15, Intel’s Mike Bernhardt kicked off the discussion with a question about the current wall for HPC memory and storage technology and how we ...



New 'Moonshot' Effort to Understand the Brain Brings AI Closer to Reality (Jan 28, 2016)
The Intelligence Advanced Research Projects Activity (IARPA) funds large-scale research programs that address the most difficult challenges facing the intelligence community. Today, intelligence agencies are inundated with data - more than they are able to analyze in a reasonable amount of time. Humans, naturally good at recognizing patterns, can't keep pace. The pattern-recognition and learning abilities of machines, meanwhile, still pale in comparison to even the simplest mammalian brains. IAR...
