May 2019

It’s Science: ‘Magic: The Gathering’ is the World’s Most Complicated Game (May 14, 2019)
Collectible card game Magic: The Gathering is a challenging cerebral exercise. Now, thanks to a new scientific study, we know that it is actually the most complex real-world game known. As reported by the MIT Technology Review, a new proof shows that optimal play in Magic: The Gathering is so difficult that a computer would be unable to determine the winner, with lasting implications for the field of game theory.

Step Towards Light-Based, Brain-Like Computing Chip (May 13, 2019)
A technology that functions like a brain? In these times of artificial intelligence, this no longer seems so far-fetched -- for example, when a mobile phone can recognise faces or languages. With more complex applications, however, computers still quickly come up against their own limitations. One of the reasons for this is that a computer traditionally has separate memory and processor units -- the consequence of which is that all data have to be sent back and forth between the two.

Discovery May Lead to New Material for Next-Generation Data Storage (May 13, 2019)
Research funded in part by the U.S. Army identified properties in materials that could one day lead to applications such as more powerful data storage devices that continue to hold information even after a device has been powered off.

Computing in Schools in ‘Steep Decline’ (May 12, 2019)
An annual study by the University of Roehampton has found that fewer 16-year-olds in England are getting a computing qualification. It also said schools have cut back on the hours spent teaching the subject. In 2018, 130,000 students got a GCSE in either computer science or ICT (information and communications technology), down from 140,000 the previous year. This year, the ICT exam is not an option.

New Tools Could Provide Ironclad Certainty that Computer Bugs are a Thing of the Past (May 12, 2019)
It's bad enough losing an hour's work when your computer crashes—but in settings like healthcare and aviation, software glitches can have far more serious consequences. In one notorious case, a computer bug caused cancer patients to receive lethal overdoses from a radiation therapy machine; in more recent headlines, flawed software was blamed for airplane crashes in Ethiopia and Indonesia.

DOE on Collision Course with End of Moore's Law (May 11, 2019)
As the largest buyer of supercomputers of any government agency in the world, the US Department of Energy (DOE) has relied on the relentless improvement of semiconductors to pursue the science it needs to advance the nation’s energy goals. But because of the slowdown and eventual demise of Moore’s Law, the way it fulfills those needs in the next decade is going to change dramatically.

Three Innovators Changing the Education Landscape Today (May 11, 2019)
America is the land of innovation, leading the world in technology, art and industry — yet we still have a 20th-century educational system. Our schools are stifled by regulatory overload, making it difficult to bring needed change to outdated ways of doing things. Fortunately for our nation’s children, enterprising individuals at all levels of education are working to change that. Here are a few of the most innovative figures in education today.

XSEDE Teams with Cal State for Advanced Computer Training (May 10, 2019)
The need for more young people in STEM careers is a growing concern for the HPC community. Along these lines, more than 70 undergraduate and graduate students participated in the recent XSEDE/DIRECT-STEM training workshops at Cal State LA. The year-long program enables undergraduate and graduate students to participate in a series of voluntary Saturday workshops.

Advanced Performance and Massive Scaling Driven by AI and DL (May 10, 2019)
Artificial Intelligence (AI) is rapidly becoming an essential business and research tool, giving organizations valuable insights into their data and doing so with unprecedented velocity and accuracy. The attraction of AI is its ability to facilitate breakthrough innovations across a variety of fields while delivering significant acceleration in time to insight.

Role Models for Women in Computational Science and Engineering (May 9, 2019)
When we say modelling, we’re not talking about strutting down the runway – in science, modelling is any activity that involves building models to understand natural or artificial processes, including quantification, interpretation, prediction, simulation, and many other activities related to understanding the world. Models can be physical, theoretical, or computational, for example, and they help us understand the complex systems all around us.

Berkeley Lab Highlights 'the Little Computer Cluster That Could' (May 9, 2019)
Decades before “big data” and “the cloud” were a part of our everyday lives and conversations, a custom computer cluster based at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) enabled physicists around the world to remotely and simultaneously analyze and visualize data.

The Hidden Seismic Symphony in Earthquake Signals (May 8, 2019)
Few months go by without another devastating earthquake somewhere in the world reminding us how we all remain at the mercy of major seismic events that strike without warning. But a new branch of geophysics powered by machine learning is uncovering fresh insights into the earth’s slipping faults that often trigger these catastrophic earthquakes.

How Machine Learning Could Change Science (May 8, 2019)
Scientific progress is inherently unpredictable, tied to sudden bursts of inspiration, unlikely collaborations, and random accidents. But as our tools have improved, the ability to create, invent and innovate has improved with them. The birth of the computing age gave scientists access to the greatest tool yet, with their most powerful variant, supercomputers, helping unlock myriad mysteries and changing the face of the modern world.

HKUST Physicist Contributes to New Record of Quantum Memory Efficiency (May 7, 2019)
Like memories in computers, quantum memories are essential components for quantum computers - a new generation of data processors that obey quantum mechanics laws and can overcome the limitations of classical computers. They may push boundaries of fundamental science and help create new drugs, explain cosmological mysteries, or enhance accuracy of forecasts and optimization plans with their potent computational power.

Scientists Have Created the Largest Simulation of an Entire Gene of DNA (May 7, 2019)
The growing interest in the complexity of biological interactions is continuously driving the need to increase system size in biophysical simulations, requiring not only powerful and advanced hardware but adaptable software that can accommodate a large number of atoms interacting through complex forcefields.

OCR4all: Modern Tool for Old Texts (May 6, 2019)
Historians and other humanities scholars often have to deal with difficult research objects: centuries-old printed works that are hard to decipher and often in an unsatisfactory state of conservation. Many of these documents have now been digitized—usually photographed or scanned—and are available online worldwide. For research purposes, this is already a step forward.

New Robust Device May Scale Up Quantum Tech (May 6, 2019)
A theory developed only two years ago proposed a way to make qubits more resilient through combining a semiconductor, indium arsenide, with a superconductor, aluminum, into a planar device. Now, this theory has received experimental support in a device that could also aid the scaling of qubits.

California Students Learn to Code to Prepare for Tomorrow's Jobs (May 5, 2019)
Artaviynia Stanley, 12, who attends Samuel Kennedy Elementary School in Sacramento, wants to be a computer science teacher someday. But her only access to a computer at home is when she borrows her grandmother’s laptop. Now she is one of thousands of students at 260 schools in low-income California communities learning computer coding after school through the Kids Code program.

Key-guessing Blockchain Banditry is Discovered in Security Research (May 5, 2019)
A bandit story of the cryptocurrency kind was a popular item on tech sites this week, with staggering amounts of money scooped up by a blockchain bandit spotted by the security consultancy Independent Security Evaluators. The weakness comes from poorly implemented private key generation, which facilitates cryptocurrency theft. The bandit is guessing private keys — and the bandit is scoring millions.

An Entirely Different Kind of Quantum Computer (May 4, 2019)
It is no accident that quantum computing is being undertaken by some of the biggest IT companies in the world. Google, IBM, and Intel in particular have the capacity to devote a lot of resources to their respective efforts, and all of them demonstrated impressive progress over the past few years. And even though we’ve devoted a lot of our coverage of quantum computing to these three key players, it’s still much too early to tell which companies will come to dominate this nascent market.

How to Turn Africa into an Edtech Hotspot (May 4, 2019)
Across the world, millions of children are out of school. Since 2000, millions of African children have benefited from better access to education, but Africa still has 30 million children out of school. And for those in school, learning is limited because teaching can be poor and resources limited. In some African countries, parents spend half of what they earn on education and half of that on textbooks, if they can find any.

DOE Announces $20 Million in AI Research Funding (May 3, 2019)
The U.S. Department of Energy announced a total of $20 million in funding for innovative research and development in artificial intelligence and machine learning. DOE’s Office of Electricity has selected eight projects to receive nearly $7 million in total to explore the use of big data, artificial intelligence, and machine learning technologies to improve existing knowledge and discover new insights and tools for better grid operation and management. DOE’s Office of Science announced a plan...

The Importance of Predictive Analytics in Higher Education (May 3, 2019)
Predictive analytics has become a hot-button topic among educators seeking to serve students better by becoming more data-informed. This is a result of the intense pressure on universities to demonstrate an ROI for students as the U.S. dropout rate remains at an all-time high.

Faster, More Accurate Diagnoses: Healthcare Applications of AI Research (May 2, 2019)
When Google DeepMind's AlphaGo shockingly defeated legendary Go player Lee Sedol in 2016, the terms artificial intelligence (AI), machine learning and deep learning were propelled into the technological mainstream. AI is generally defined as the capacity for a computer or machine to exhibit or simulate intelligent behavior such as Tesla's self-driving car and Apple's digital assistant Siri.

Army Researchers Identify New Way to Improve Cybersecurity (May 2, 2019)
With cybersecurity one of the nation's top security concerns and billions of people affected by breaches last year, government and businesses are spending more time and money defending against cyberattacks. Researchers at the U.S. Army Combat Capabilities Development Command's Army Research Laboratory, the Army's corporate research laboratory also known as ARL, and Towson University may have identified a new way to improve network security.

©1994–2019 Shodor