[2203.03540] GatorTron: A Large Clinical Language Model to Unlock Patient Information from Unstructured Electronic Health Records
February 2, 2022 · GatorTron models scale up the clinical language model from 110 million to 8.9 billion parameters and improve 5 clinical NLP tasks (e.g., 9.6% and 9.5% improvement in accuracy for NLI and MQA), which can be applied to medical AI systems to …
GitHub - uf-hobi-informatics-lab/GatorTron: all scripts used in ...
An evaluation of GatorTron on a de-identification task (i.e., detecting and removing 18 types of personal identifiers, such as names and birth dates, from protected health information) showed that the newly trained GatorTron language model achieved state-of-the-art performance.
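De-identification of this kind is commonly framed as token classification (named-entity recognition) over PHI categories. The sketch below shows one plausible way to put a token-classification head on the GatorTron-Base checkpoint via the Hugging Face `transformers` Auto classes; the PHI label list and the example sentence are hypothetical placeholders for illustration, not the repository's actual pipeline.

```python
# Illustrative sketch only: casting de-identification as token classification
# with a GatorTron encoder. The PHI label list and input sentence are
# hypothetical placeholders, not the uf-hobi-informatics-lab pipeline.
from transformers import AutoTokenizer, AutoModelForTokenClassification

phi_labels = ["O", "B-NAME", "I-NAME", "B-DATE", "I-DATE"]  # subset of the 18 PHI types

tokenizer = AutoTokenizer.from_pretrained("UFNLP/gatortron-base")
model = AutoModelForTokenClassification.from_pretrained(
    "UFNLP/gatortron-base",
    num_labels=len(phi_labels),
    id2label=dict(enumerate(phi_labels)),
)

tokens = tokenizer("John Smith was admitted on 01/02/2020.",
                   return_tensors="pt")
logits = model(**tokens).logits  # (1, seq_len, num_labels)
pred_ids = logits.argmax(-1)     # per-token PHI predictions (head untrained here; fine-tuning needed)
```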
UFNLP/gatortron-base - Hugging Face
Developed jointly by the University of Florida and NVIDIA, GatorTron-Base is a clinical language model of 345 million parameters, pre-trained using a BERT architecture implemented in the Megatron package (https://github.com/NVIDIA/Megatron-LM).
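As a quick orientation, here is a minimal sketch of loading the checkpoint with the standard `transformers` Auto classes and encoding one clinical sentence. It assumes `transformers` and `torch` are installed; the example sentence is illustrative.

```python
# Minimal sketch: load GatorTron-Base from the Hugging Face Hub and
# encode a clinical sentence to obtain contextual token embeddings.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("UFNLP/gatortron-base")
model = AutoModel.from_pretrained("UFNLP/gatortron-base")

encoded = tokenizer("Bone scan: negative for distant metastasis.",
                    return_tensors="pt")
outputs = model(**encoded)
# outputs.last_hidden_state: (batch, seq_len, hidden_size)
print(outputs.last_hidden_state.shape)
```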
UFNLP/gatortronS - Hugging Face
Developed jointly by the University of Florida and NVIDIA, GatorTronS is a clinical language model of 345 million parameters, pre-trained using a BERT architecture implemented in the Megatron package (https://github.com/NVIDIA/Megatron-LM). GatorTronS is pre-trained using a dataset consisting of: …
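Since GatorTronS is the same 345-million-parameter BERT-style encoder, it loads the same way. The sketch below additionally mean-pools the token embeddings into a single sentence vector; the pooling choice is an assumption for illustration, not something prescribed by the model card.

```python
# Sketch: derive a sentence-level embedding from GatorTronS by mean-pooling
# token embeddings (the pooling strategy is an illustrative assumption).
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("UFNLP/gatortronS")
model = AutoModel.from_pretrained("UFNLP/gatortronS")

enc = tokenizer("Patient denies chest pain or shortness of breath.",
                return_tensors="pt")
with torch.no_grad():
    hidden = model(**enc).last_hidden_state          # (1, seq_len, hidden)
mask = enc["attention_mask"].unsqueeze(-1)           # (1, seq_len, 1)
sentence_vec = (hidden * mask).sum(1) / mask.sum(1)  # masked mean pooling
print(sentence_vec.shape)                            # (1, hidden_size)
```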
A large language model for electronic health records
December 26, 2022 · GatorTron models scale up the clinical language model from 110 million to 8.9 billion parameters and improve five clinical NLP tasks (e.g., 9.6% and 9.5% improvement in accuracy for NLI and...
GatorTron: A Large Language Model for Clinical Natural Language ...
March 18, 2022 · The proposed GatorTron models performed remarkably better on more complex clinical NLP tasks such as natural language inference (9.6% and 7.5% improvements) and question answering (9.5% and 7.77% improvements) compared with existing smaller clinical transformer models (i.e., BioBERT and ClinicalBERT), demonstrating the potential of large ...
February 27, 2022 · GatorTron scaled up transformer-based clinical language models to a size of 8.9 billion parameters and achieved state-of-the-art performance on 5 clinical NLP tasks of different linguistic levels targeting various healthcare information documented in unstructured electronic health records (EHRs).
GatorTron is now the largest transformer model in the clinical domain, scaled up from the previous 110 million to 8.9 billion parameters, and achieved state-of-the-art performance on the 5 clinical NLP tasks targeting various healthcare …
University of Florida Health, NVIDIA develop artificial intelligence ...
April 8, 2021 · GatorTron™ was pre-trained on HiPerGator AI, UF's own NVIDIA DGX SuperPOD AI supercomputer, in a mere seven days. It is the first natural language processing clinical model of its scale in the world, setting the stage for myriad downstream medical applications that were previously unachievable.