Transformers at Scale for Science
Description
Transformers and other large language models have shown impressive capabilities as 'foundation' models for domains such as natural language processing and computer vision. Developing them requires huge amounts of scalable compute: self-supervised training on large datasets yields models that can be adapted to a variety of specialized tasks. Recent efforts in areas such as bioinformatics and protein folding indicate the significant potential of Transformer models in domain science applications. In this session, presenters and attendees will discuss new algorithms, software, and hardware for training Transformers on large domain science datasets, as well as novel ideas for applying Transformers in this space.
Event Type
Birds of a Feather
Time
Thursday, 17 November 2022, 12:15pm - 1:15pm CST
Location
D168
Registration Categories
TP, XO/EX