July 2018

Hackers Easily Fool Artificial Intelligences (Jul 25, 2018)
Last week, at the International Conference on Machine Learning (ICML), a group of researchers described a turtle they had 3D printed. Most people would say it looks just like a turtle, but an artificial intelligence (AI) algorithm saw it differently. Most of the time, the AI thought the turtle looked like a rifle. Similarly, it saw a 3D-printed baseball as an espresso. These are examples of “adversarial attacks”—subtly altered images, objects, or sounds that fool AIs without setting off hu...
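Adversarial attacks of this kind typically nudge an input in whatever direction most increases a model's loss, keeping each change small enough to go unnoticed. Below is a minimal sketch of one standard technique, the fast gradient sign method, assuming PyTorch and a hypothetical classifier; it is illustrative only and far simpler than the 3D-printed turtle attack described at ICML.

```python
# Minimal fast-gradient-sign-method (FGSM) sketch in PyTorch.
# `model` is a hypothetical classifier; the physical-object attacks in the
# article are considerably more involved than this.
import torch
import torch.nn.functional as F

def fgsm_attack(model, image, true_label, epsilon=0.03):
    """Return an adversarially perturbed copy of `image`."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), true_label)
    loss.backward()
    # Step each pixel slightly in the direction that increases the loss.
    perturbed = image + epsilon * image.grad.sign()
    return perturbed.clamp(0.0, 1.0).detach()

# Example usage with a toy model and a random "image":
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 10))
x = torch.rand(1, 3, 32, 32)          # batch of one 32x32 RGB image
y = torch.tensor([3])                  # its (hypothetical) correct class
x_adv = fgsm_attack(model, x, y)
print((x_adv - x).abs().max())         # perturbation stays within epsilon
```

The point of bounding the perturbation by epsilon per pixel is exactly what the article describes: the altered input still looks unremarkable to a human while steering the model toward the wrong answer.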



Reversing Cause and Effect is No Trouble for Quantum Computers (Jul 24, 2018)
In research published in Physical Review X, an international team shows that a quantum computer is less in thrall to the arrow of time than a classical computer. In some cases, it's as if the quantum computer doesn't need to distinguish between cause and effect at all. The new work is inspired by an influential discovery made almost ten years ago by complexity scientists James Crutchfield and John Mahoney at the University of California, Davis. They showed that many statistical data sequences wi...



Eagle-Eyed Machine Learning Algorithm Outdoes Human Experts (Jul 24, 2018)
Artificial intelligence is now so smart that silicon brains frequently outthink people. Computers operate self-driving cars, pick friends' faces out of photos on Facebook, and are learning to take on jobs typically entrusted only to human experts. Researchers from the University of Wisconsin-Madison and Oak Ridge National Laboratory have trained computers to quickly and consistently detect and analyze microscopic radiation damage to materials under consideration for nuclear reactors. And the com...



NSF Awards More than $150 Million to Early Career Researchers (Jul 23, 2018)
The National Science Foundation (NSF) has invested $150 million in 307 early career engineering and computer science faculty to advance fields from intelligent infrastructure and collaborative robots to secure communications and brain-related technologies. Over the next five years, each researcher will receive up to $500,000 from NSF to build a firm scientific footing for solving challenges and scaling new heights for the nation, as well as serve as academic role models in research and education...



Physicians’ “Gut Feelings” Influence How Many Tests They Order for Patients (Jul 23, 2018)
Many technology companies are working on artificial intelligence systems that can analyze medical data to help diagnose or treat health problems. Such systems raise the question of whether this kind of technology can perform as well as a human doctor. A new study from MIT computer scientists suggests that human doctors provide a dimension that, as yet, artificial intelligence does not.



Major Companies Partner with Colleges for Education Opportunities in Emerging Tech (Jul 22, 2018)
As recent graduates enter the workforce, many employers are not confident in the skills these new entrants bring with them. Only 35 percent of employers report feeling confident that new recruits are well prepared with hard technical skills and “soft skills” such as complex problem-solving and analytical reasoning, according to a recent Bloomberg Next report.



Focused on Filling Gaps in Computer Science Workforce (Jul 22, 2018)
Paul Gross, assistant professor of software development, says he works on the practical side of computer science. “It’s the side where it’s more important to know how to build software than it is to do research,” he says. “We have a massive shortage of people who can work on that practical side, people who have different areas of expertise. We need graduates who have studied chemistry or psychology, who can take that specific knowledge and develop software.”



NSF’s IceCube Observatory Finds First Evidence of Cosmic Neutrino Source (Jul 21, 2018)
In 1911 and 1912, Austrian physicist Victor Franz Hess made a series of ascents in a hydrogen balloon in a quest to find the source of ionizing radiation that registered on an electroscope. The prevailing theory was that the radiation came from the rocks of the Earth. During the last of his seven flights, Hess ascended to more than 5,300 meters – almost 17,400 feet – and found that the rate of ionization was three times that at sea level.



Education Technology’s Machine Learning Problem and Responsibility (Jul 21, 2018)
From Formula 1 to Yelp, industries across the board are seeking ways to apply machine learning to their work. Even academics and Goldman Sachs analysts tried using it to predict World Cup winners. (Those predictions proved very, very wrong.) But how is machine learning playing out in education—and how does it impact not just students, educators and parents, but also the businesses building technology tools to support teaching and learning?



Net Neutrality and Impact on Cloud Computing (Jul 20, 2018)
The FCC’s repeal of net neutrality rules, which took effect June 11, further entrenches Internet Service Providers (ISPs) as an oligopoly, now capable of charging both customers and service providers for access. Conceptually, the Internet in the US may look very different from the Internet in the rest of the world in the coming years.



A Sea Change Coming for Water Cooling in Datacenters (Jul 20, 2018)
Back in the late 1970s and early 1980s, big iron in datacenters had to have water cooling, which was a pain in the neck in terms of the system and facilities engineering. And it was a big deal – and a big competitive threat – when former IBM System/360 architect Gene Amdahl – you know, the guy with the law named after him – left to start his own mainframe company and created a line of compatible mainframes that were strictly air cooled.



‘Big Data Challenges and Advanced Computing Solutions’ Focus of House Committee Meeting (Jul 19, 2018)
On Thursday, July 12, the House Committee on Science, Space, and Technology heard from four academic and industry leaders – representatives from Berkeley Lab, Argonne Lab, GE Global Research and Carnegie Mellon University – on the opportunities springing from the intersection of machine learning and advanced-scale computing.



How to Become a Dynamic Technology Leader (Jul 19, 2018)
The core of Information Technology within businesses has always been the implementation and facilitation of technology to meet the needs of the company. The explosion of the digital realm has seen a rapid expansion in the scope and demands upon the sector. Eighty-four percent of Chief Information Officers (CIOs) report that their role now includes responsibilities that fall outside the traditional IT sphere.



Robots Can’t Hold Stuff Very Well, But You Can Help (Jul 18, 2018)
Imagine, for a moment, the simple act of picking up a playing card from a table. You have a couple of options: Maybe you jam your fingernail under it for leverage or drag it over the edge of a table. Now imagine a robot trying to do the same thing. Tricky: Most robots don’t have fingernails, or friction-facilitating fingerpads that perfectly mimic ours. So many of these delicate manipulations continue to escape robotic control. But engineers are making steady progress in getting the machin...
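Whether a pinch grasp holds at all comes down to friction: each fingertip can resist sliding only up to the friction coefficient times the squeezing force. The short Python sketch below runs that back-of-the-envelope check for a thin card once it has been lifted off the table; all numbers are hypothetical, and it stands in for none of the research mentioned in the article.

```python
# Back-of-the-envelope check for a two-finger pinch grasp on a thin card.
# All numbers are hypothetical; real grasp planning is far more involved.
def pinch_holds(mass_kg, squeeze_force_n, friction_coeff, g=9.81):
    """True if friction at two fingertip contacts can support the card's weight."""
    weight = mass_kg * g
    max_friction = 2 * friction_coeff * squeeze_force_n   # two contact patches
    return max_friction >= weight

# A 2 g playing card, 0.5 N squeeze, slick fingertip pads (mu = 0.2):
print(pinch_holds(0.002, 0.5, 0.2))   # True: even weak friction holds the card
# As the article notes, the hard part is getting fingers under the card at all.
```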



Study Shows Virtual Reality Could Hold the Key to GPs Spotting Child Abuse (Jul 18, 2018)
A three-year research project led by a University of Birmingham academic, working with colleagues from Goldsmiths and University College London, has indicated that virtual reality (VR) could become a vital tool for training General Practitioners (GPs) to look out for hard-to-detect signs of child abuse.



Building the Next Generation of Computational Thinkers (Jul 17, 2018)
Historically, advances in science have taken years to make a real-world impact. One example, lithium-ion batteries, which fueled pocket computing on mobile devices, got their start in 1980 but were not commercialized until more than a decade later. At Argonne National Laboratory, we work hard to accelerate the journey from scientific discovery to impact. Faster, more powerful computers and high-throughput methods for analysis, which allow researchers to execute experiments and simulations more q...



Learning Hard Quantum Distributions with Variational Autoencoders (Jul 17, 2018)
The exact description of many-body quantum systems represents one of the major challenges in modern physics, because it requires an amount of computational resources that scales exponentially with the size of the system. Simulating the evolution of a state, or even storing its description, rapidly becomes intractable for exact classical algorithms. Recently, machine learning techniques, in the form of restricted Boltzmann machines, have been proposed as a way to efficiently represent certain qua...
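Restricted Boltzmann machines and variational autoencoders are both generative models that compress a distribution into a small set of latent variables. The sketch below, assuming PyTorch, shows only the generic core of a variational autoencoder, the reparameterization step that lets the latent code be sampled while still being trained by gradient descent; it is not the architecture used in the paper.

```python
# Generic variational autoencoder (VAE) core in PyTorch; a toy example,
# not the quantum-state representation described in the paper.
import torch
import torch.nn as nn

class TinyVAE(nn.Module):
    def __init__(self, n_visible=16, n_latent=4):
        super().__init__()
        self.enc = nn.Linear(n_visible, 2 * n_latent)   # outputs mean and log-variance
        self.dec = nn.Linear(n_latent, n_visible)

    def forward(self, x):
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        # Reparameterization trick: sample z while keeping gradients.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        recon = torch.sigmoid(self.dec(z))
        # KL divergence of the approximate posterior from a unit Gaussian.
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1)
        return recon, kl

vae = TinyVAE()
x = torch.bernoulli(torch.full((8, 16), 0.5))   # 8 samples of 16 binary "spins"
recon, kl = vae(x)
loss = nn.functional.binary_cross_entropy(recon, x, reduction="sum") + kl.sum()
loss.backward()
```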



Breakthrough in Construction of Computers for Mimicking Human Brain (Jul 16, 2018)
A computer built to mimic the brain's neural networks produces results similar to those of the best brain-simulation supercomputer software currently used for neural-signaling research, finds a new study published in the open-access journal Frontiers in Neuroscience. Tested for accuracy, speed and energy efficiency, this custom-built computer, named SpiNNaker, has the potential to overcome the speed and power consumption problems of conventional supercomputers. The aim is to advance our knowledge ...
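Both SpiNNaker and the reference supercomputer software step networks of spiking neurons forward in small time increments. The simplest such neuron is the leaky integrate-and-fire model; the NumPy sketch below is a toy, single-neuron illustration with made-up parameters, not the cortical-microcircuit benchmark used in the study.

```python
# Toy leaky integrate-and-fire (LIF) neuron in NumPy, the kind of model that
# spiking-network simulators such as SpiNNaker step forward in time.
# Parameters are illustrative, not taken from the study.
import numpy as np

dt, tau = 0.1, 10.0                        # time step and membrane constant (ms)
v_rest, v_thresh, v_reset = -65.0, -50.0, -65.0   # membrane potentials (mV)
steps = 1000
v = v_rest
input_current = 20.0 * np.ones(steps)      # constant injected current (arbitrary units)
spike_times = []

for t in range(steps):
    # Membrane potential leaks toward rest and integrates the input current.
    v += dt / tau * (v_rest - v + input_current[t])
    if v >= v_thresh:                      # threshold crossing emits a spike
        spike_times.append(t * dt)
        v = v_reset                        # membrane resets after the spike

print(f"{len(spike_times)} spikes in {steps * dt:.0f} ms")
```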



First Machine Learning Method Capable of Accurate Extrapolation (Jul 16, 2018)
Understanding how a robot will react under different conditions is essential to guaranteeing its safe operation. But how do you know what will break a robot without actually damaging it? A new method developed by scientists at the Institute of Science and Technology Austria (IST Austria) and the Max Planck Institute for Intelligent Systems (MPI for Intelligent Systems) is the first machine learning method that can use observations made under safe conditions to make accurate predictions for all p...
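One route to trustworthy extrapolation is to fit a model whose functional form matches the underlying physics rather than a flexible black box. The sketch below, using NumPy and SciPy, contrasts a high-degree polynomial with a sinusoidal fit outside the training range; it is a generic illustration, not the method from the IST Austria and MPI paper.

```python
# Illustration of extrapolation: a black-box polynomial vs. a model whose
# functional form matches the data-generating process. Generic example only.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
x_train = np.linspace(0, 2 * np.pi, 50)                # "safe" operating range
y_train = np.sin(x_train) + 0.05 * rng.normal(size=50)
x_test = np.linspace(2 * np.pi, 4 * np.pi, 50)         # unseen conditions

# Black-box fit: a degree-9 polynomial interpolates well but extrapolates badly.
poly = np.polynomial.Polynomial.fit(x_train, y_train, deg=9)

# Structured fit: assume the right functional family, y = a * sin(b*x + c).
def model(x, a, b, c):
    return a * np.sin(b * x + c)

params, _ = curve_fit(model, x_train, y_train, p0=[1.0, 1.0, 0.0])

true = np.sin(x_test)
print("polynomial extrapolation error:", np.abs(poly(x_test) - true).max())
print("structured extrapolation error:", np.abs(model(x_test, *params) - true).max())
```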



Turn on the Switch (Jul 15, 2018)
Supercomputers are the sports cars of the technology world: fast, glamorous and expensive. This might be why Dag Spicer, senior curator at the Computer History Museum, finds them fascinating. Recently, Spicer and his team in Mountain View, CA, unanimously accepted a piece of TACC's history into their permanent historical collection — sealing its place as a milestone in computing.



Researchers Use Machine Learning to Analyze Movie Preferences (Jul 15, 2018)
Could behavioral economics and machine learning help to better understand consumers' movie preferences? A team of researchers from the University of Cambridge, the University of the West of England, and the Alan Turing Institute dove deeper into this question, in a fascinating study that combines behavioral economics, business and AI.



HPC Serves as a ‘Rosetta Stone’ for the Information Age (Jul 14, 2018)
In an age defined and transformed by its data, several large-scale scientific instruments around the globe might be viewed as a mother lode of precious data. With names seemingly created for a techno-speak glossary, these interferometers, cyclotrons, sequencers, solenoids, satellite altimeters, and cryo-electron microscopes are churning out data in previously unthinkable and seemingly incomprehensible quantities — billions, trillions and quadrillions of bits and bytes of electro-magnetic code.



D-Wave Demonstrates Large-Scale Programmable Quantum Simulation (Jul 14, 2018)
D-Wave Systems announced the publication of a significant scientific result in the peer-reviewed journal Science. The article, titled “Phase transitions in a programmable spin glass simulator,” details how a D-Wave 2000Q quantum computer was used to predict phase transitions within a particular quantum mechanical system known as the transverse field Ising model.
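The transverse field Ising model mentioned here assigns each spin an energy from its coupling to neighbors plus a field term that flips spins; for a handful of spins the same Hamiltonian can be built and diagonalized exactly on a laptop. The NumPy sketch below does that for a short chain with illustrative parameters; it is a classical toy, not a simulation of the D-Wave hardware or of the phase transitions reported in Science.

```python
# Exact diagonalization of a small transverse-field Ising chain in NumPy:
# H = -J * sum_i Z_i Z_{i+1} - G * sum_i X_i.  A classical toy, not D-Wave.
import numpy as np

N, J, G = 6, 1.0, 0.5                         # spins, coupling, transverse field
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=float)
Z = np.array([[1, 0], [0, -1]], dtype=float)

def site_op(op, i):
    """Embed a single-site operator `op` at position i in the N-spin chain."""
    mats = [op if k == i else I2 for k in range(N)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

H = np.zeros((2**N, 2**N))
for i in range(N - 1):
    H -= J * site_op(Z, i) @ site_op(Z, i + 1)   # nearest-neighbor coupling
for i in range(N):
    H -= G * site_op(X, i)                       # transverse field term

energies = np.linalg.eigvalsh(H)
print("ground-state energy per spin:", energies[0] / N)
```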



New Study Finds Taking Breaks Boosts Team Performance (Jul 2, 2018)
Want to be a good team player? Take a break. It may improve not only your own performance but the chances of your team winning overall, says a new study by a team of USC computer scientists. Researchers from USC Viterbi’s Information Sciences Institute (ISI) crunched data from thousands of players in a popular online video game to analyze individual performance in teams over time. They also examined the impact of expertise on performance and other factors influencing player behavior, such as c...



AI for Pharma R&D: Creating Anti-cancer Drugs Faster (Jul 1, 2018)
The cost and process of developing anti-cancer drugs have been an extreme challenge for decades. Today one company, AccutarBio, is harnessing the power of AI to accelerate drug discovery and reform the current “hit-to-lead” drug discovery scheme. The company recently received $15 million in funding (including money from Chinese AI/facial recognition company YITU) and is now partnering with Amgen.
