Resources are allocated for the new period
Constant development in technology and digital working methods means that more and more scientific disciplines need high-performance computing and storage services to conduct research. With nearly 600 projects and 2000 users on the national systems, we are proud to contribute to ground-breaking scientific results and knowledge that benefits society. We are pleased to once again welcome new and returning users to the national supercomputers and storage facilities for a new allocation period.
We wish all researchers good luck with their activities on the national e-infrastructure and look forward to the scientific results and new knowledge they will bring.
The banner image was generated by AI from this prompt: create a futuristic computer-based research environment with researchers working on computers.
GPT on the national services
New in this application period is the increased number of inquiries about resources for GPT model training and the corresponding demand for GPU hours. Training of Generative Pre-trained Transformers, or GPT models, falls under the language technology research field. The collective demand from these research fields alone amounted to 626 000 GPU hours.
For those unfamiliar with GPU hours: one GPU hour corresponds to one hour of wall-clock compute on a single GPU, so a job running on several GPUs accumulates GPU hours proportionally. For example, two GPUs working simultaneously for 30 minutes on a task amount to one GPU hour. These GPU hours will mainly be served by LUMI, which delivers its GPU compute capacity from AMD MI250X cards.
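The GPU-hour arithmetic above can be sketched in a few lines; this is an illustrative helper, not part of any Sigma2 tooling:

```python
def gpu_hours(num_gpus: int, wall_clock_hours: float) -> float:
    """GPU hours consumed by a job: number of GPUs times wall-clock time."""
    return num_gpus * wall_clock_hours

# Two GPUs running simultaneously for 30 minutes -> 1 GPU hour.
print(gpu_hours(2, 0.5))  # 1.0

# Eight GPUs for a full day -> 192 GPU hours.
print(gpu_hours(8, 24))  # 192
```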
Unprecedented capacity for Machine Learning
Natural Language Processing (NLP) is the sub-discipline of Computer Science that enables data systems to "make sense" of human language. In a nutshell, language models are very large artificial neural networks, ranging from around 1 billion to hundreds of billions of parameters, scalar values that determine the flow of information through the network.
Parameters are learned from natural language training data, and large language model (LLM) training, evaluation, and deployment all call for large-scale GPU compute resources.
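To give a feel for these parameter counts, a common back-of-the-envelope approximation for a dense transformer is P ≈ 12 × n_layers × d_model² (the constant 12 folds in the attention and feed-forward weight matrices; embedding tables are ignored). This is a rough sketch, not an exact count for any particular model:

```python
def approx_transformer_params(n_layers: int, d_model: int) -> int:
    """Rough dense-transformer parameter count: 12 * layers * width^2."""
    return 12 * n_layers * d_model ** 2

# A configuration similar to GPT-3 (96 layers, model width 12288)
# lands near the often-quoted ~175 billion parameters.
print(f"{approx_transformer_params(96, 12288):.2e}")  # 1.74e+11
```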
Training a comparatively small model like the Norwegian BERT requires about 10 000 GPU hours, while the compute costs for a massive model like ChatGPT remain unknown.
Professor and Project Leader Stephan Oepen and his colleagues at the University of Oslo will receive 375 000 GPU hours on LUMI to continue training their models. Read more about the project: Unprecedented capacity for machine learning.
High expectations for the new NIRD
Our largest users of the national storage services in terms of storage quota are the data-driven fields of Climate and Earth Science. There are, however, plenty of projects from other disciplines, too. We look forward to enhancing value for all our researchers with the new generation of the NIRD storage infrastructure during the coming periods. With a capacity of 32 petabytes (PB) and a future expansion capacity of up to 70 PB if needed, we are much better equipped to meet our researchers' growing needs, not only for storage but also for access to new and future-oriented services for all scientific disciplines that require secure storage, processing and publication of research data.
Here are the top 3 storage projects allocated by the Resource Allocation Committee (RFK):
1. 1800 terabytes (TB): Storage for nationally coordinated NorESM experiments
Project Leader Mats Bentsen, NORCE
2. 1200 TB: Storage for INES - Infrastructure for Norwegian Earth System modelling
Project Leader Mats Bentsen, NORCE
3. 1050 TB: Bjerknes Climate Prediction Unit
Project Leader Noel Keenlyside, University of Bergen
Volcanic eruptions and Viking society
Winter has taken hold in Norway this year as we are experiencing troubled times on a global scale. Hopefully, it is not the Fimbulwinter that is upon us - which, according to Norse mythology, occurs before Ragnarok.
Research has linked the mythological winter to a climate disaster 1500 years ago. Within the VIKINGS project, researchers at the Department of Geosciences at the University of Oslo aim to understand how volcanic eruptions and climate change shaped the early history of Europe and the Viking era.
The multi-disciplinary research of the VIKINGS project, led by Professor Kristin Krüger, will receive 2.2 million CPU hours and 450 terabytes of storage on the brand-new NIRD for the upcoming period to keep investigating the climate of the intriguing historical period of the Vikings.
Read more: Volcanic eruptions and Viking society
The demand for CPU hours keeps increasing
The size of computing projects in terms of consumed CPU hours keeps increasing every year, underlining the importance of equipping researchers with high-quality national services. We are happy to share that we have commenced the acquisition process for Norway's next national supercomputer, which is expected to enter production next year.
As in previous periods, Earth Sciences and Physics are by far the largest consumers of computing time on the national HPC services.
Here are the top 3 largest HPC projects for the upcoming period:
1. 115 million CPU hours: Solar Atmospheric Modelling
Project Leader Mats Carlsson, Rosseland Centre for Solar Physics, University of Oslo
2. 70 million CPU hours: Combustion of hydrogen blends in gas turbines at reheat condition
Project Leader Andrea Gruber, SINTEF
3. 47 million CPU hours: Quantum chemical studies of molecular properties and energetics of large and complex systems
Project Leader Trygve Helgaker, the Department of Chemistry, University of Oslo
Spider venom's precision strike
Another fascinating project on the national computing service uses HPC to shed light on spiders and how their toxins target the cell membranes of their prey.
Luckily, supercomputer Betzy is not afraid of spiders. Professor and Project Leader Nathalie Reuter, Senior Engineer Emmanuel Moutoussamy and fellow researchers at the University of Bergen use her ample power for molecular dynamics simulations to learn more. During the coming period, they will receive 10 million CPU hours on Betzy to continue this interesting research.
The venom of recluse spiders contains an enzyme that destroys the cell membrane of their prey. However, different prey has different membrane compositions, making the toxins' lipid recognition mechanism relevant for their choice of prey.
Read more: Spider venom's precision strike: How toxins target cell membranes of their prey
Easy and secure processing of sensitive data
Sensitive Data Services (TSD) is a service delivered by the University of Oslo and provided as a national service through Sigma2.
Brain health with Lifebrain
What if we could prevent mental diseases and neurodegenerative disorders? Our mental health considerably impacts how we feel in everyday life, and good brain health can even delay the development of dementia as we age. On the Sensitive Data Services, TSD, we have a project called Lifebrain, led by Inge Amlien at the University of Oslo. The project aims to advance our knowledge of the risk and protective factors impacting brain health.
Amlien and his fellow researchers use Sigma2 and the Sensitive Data Services (TSD) to enable cooperation with researchers at 13 international partner sites across 8 European countries. Read more: Brain health with Lifebrain