Kyoto University in Japan loses 77TB of research data due to flaw in its supercomputer

A problem with the backup system of Kyoto University’s Hewlett-Packard supercomputer resulted in the loss of around 77TB of research data. The incident occurred between December 14 and 16, 2021, deleting roughly 34 million files belonging to 14 research groups, along with their backup copies.


After analysing the consequences of the loss, the university determined that the work of four of the affected groups could no longer be restored. According to sources, the impacted users were notified of the data loss by email. As Bleeping Computer notes, supercomputer time can cost hundreds of dollars per hour, although there is no information on the type of work that was lost. Kyoto is one of Japan’s most important research institutions, receiving the second-largest government grant investment in scientific research.

For the time being, the backup process has been suspended to prevent additional data loss, and the backup system has been taken offline, with plans to enhance it before it returns to service in January 2022. In addition to full backup mirrors, the institution intends to keep incremental backups, which cover only files that have changed since the last backup.
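The idea behind incremental backups can be sketched in a few lines of Python. This is a generic illustration of the technique, not Kyoto University’s actual tooling; the function and manifest names are hypothetical. Each run hashes every file and copies only those whose contents differ from what was seen at the previous backup:

```python
import hashlib
import shutil
from pathlib import Path


def file_hash(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def incremental_backup(src: Path, dst: Path, manifest: dict) -> list:
    """Copy only files that are new or changed since the last run.

    `manifest` maps relative paths to the content hash recorded at the
    previous backup; it is updated in place. Returns the relative paths
    that were copied this run.
    """
    copied = []
    for path in src.rglob("*"):
        if not path.is_file():
            continue
        rel = str(path.relative_to(src))
        digest = file_hash(path)
        if manifest.get(rel) != digest:  # new or modified file
            target = dst / rel
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(path, target)  # preserves timestamps/metadata
            manifest[rel] = digest
            copied.append(rel)
    return copied
```

Because an unchanged file is never rewritten, a bug in the backup software has a smaller window in which to clobber good data than it would during a full mirror pass, which is one reason sites combine the two approaches.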

Kyoto is renowned for its scientific and research expertise, particularly in the field of chemistry, where it ranks fourth in the world. Biology, pharmacology, immunology, materials science, and physics are among the other areas to which the university contributes. The Riken Center for Computational Science in Kobe presently operates the world’s most powerful supercomputer, Fugaku. The Fujitsu-built system has a computational performance of 442 PFLOPS; IBM’s Summit supercomputer, ranked second on the global list, trails at 148 PFLOPS. Fugaku cost $1.2 billion to build and has so far been used for COVID-19 research, diagnostics, pharmaceuticals, and virus-transmission simulations.

As Bleeping Computer pointed out in its report, the cost of supercomputing should not be dismissed lightly: an hour of operation can run to hundreds of dollars, depending on the nature of the research. Kyoto University has since released additional details on the storage data loss following the incident earlier this month.

The most intriguing aspect of supercomputing is that it performs tasks that standard computing cannot, carrying out exceedingly complex mathematical calculations. Experts continue to explore its applications by applying supercomputers to physics, climate change, and other fields of inquiry.

Astronomers used the ATERUI II supercomputer to model 4,000 universes earlier this year, according to Tech Times. Researchers at the National Astronomical Observatory of Japan (NAOJ) used the mapping models to study the early state of the universe.

In February, the same outlet published an article on how the world’s fastest computer could anticipate tsunamis in real time. Using artificial intelligence (AI), experts were able to generate 20,000 different natural-disaster scenarios. Because Japan sits on the Pacific Ring of Fire, such early-warning work is a priority: this way, people could be alerted ahead of time if a tidal wave is headed toward a given location.
