What is it about?
Collecting and analyzing vast amounts of data is often necessary for scientific pursuits, such as the sPHENIX experiment at the Relativistic Heavy Ion Collider. However, this data arrives at rates measured in terabits per second, which poses a significant challenge for storage. We, an interdisciplinary team of researchers at Brookhaven National Laboratory, are working toward an AI-directed solution to this problem. By using compression algorithms built on a deep learning architecture called a bicephalous convolutional autoencoder (BCAE), we can now compress data at high throughput with minimal loss of essential information. This new technique could open the door to exciting possibilities, helping to expand our understanding of the universe.
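To make the "bicephalous" idea concrete, below is a minimal sketch of a two-headed convolutional autoencoder in PyTorch: a shared encoder produces a compact code, one decoder head classifies which pixels carry signal, and the other reconstructs the signal amplitude. The layer sizes, the signal threshold, and the loss terms are purely illustrative assumptions and do not reproduce the architecture or training recipe from the paper.

```python
import torch
import torch.nn as nn

class BCAESketch(nn.Module):
    """Illustrative two-headed ("bicephalous") convolutional autoencoder.

    The encoder compresses a 2D slice of detector data into a small latent
    code; one decoder head predicts which pixels contain signal, the other
    regresses the signal amplitude. Shapes are placeholders, not the paper's.
    """

    def __init__(self):
        super().__init__()
        # Encoder: downsample the input into a compact latent representation.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv2d(8, 4, kernel_size=3, stride=2, padding=1),
        )
        # Classification head: probability that each pixel holds real signal.
        self.clf_head = nn.Sequential(
            nn.ConvTranspose2d(4, 8, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
            nn.ConvTranspose2d(8, 1, kernel_size=4, stride=2, padding=1),
            nn.Sigmoid(),
        )
        # Regression head: reconstructed signal amplitude for each pixel.
        self.reg_head = nn.Sequential(
            nn.ConvTranspose2d(4, 8, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
            nn.ConvTranspose2d(8, 1, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, x):
        code = self.encoder(x)  # compact code: this is what would be stored
        return self.clf_head(code), self.reg_head(code)


# Self-supervised training sketch: the input serves as its own target.
model = BCAESketch()
x = torch.rand(2, 1, 64, 64)          # fake 2D detector slices
mask = (x > 0.5).float()              # "signal present" label from an arbitrary threshold
prob, amp = model(x)
loss = nn.functional.binary_cross_entropy(prob, mask) + \
       nn.functional.mse_loss(amp * mask, x * mask)
loss.backward()
```

The general idea behind autoencoder-based compression is that only the small encoder output needs to be written to storage, while the decoder heads can reconstruct the original data later, offline.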
Why is it important?
Lossy data compression is an important area of study for reducing the volume of scientific data. Although many compression algorithms already exist for collision data, they may not be well suited to our particular use case. Combining neural networks with self-supervised learning offers a more flexible approach that can be tailored to our specific requirements. Our research demonstrates that customized deep neural networks can meet the strict demands of scientific experiments, such as high accuracy and high throughput.
Read the Original
This page is a summary of: Fast 2D Bicephalous Convolutional Autoencoder for Compressing 3D Time Projection Chamber Data, November 2023, ACM (Association for Computing Machinery). DOI: 10.1145/3624062.3625127.