Use when working with PubMedBERT — a BERT model pretrained from scratch on PubMed abstracts — for biomedical NLP tasks including named entity recognition, relation extraction, and question answering.
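The usual entry point is the Hugging Face Transformers library. A minimal sketch, assuming the `microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext` checkpoint published on the Hugging Face Hub (the heavy import is deferred inside the helper so the rest of the module loads without downloading model weights):

```python
MODEL_ID = "microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext"

def top_predictions(results, k=3):
    """Return the k highest-scoring predicted tokens from a fill-mask result."""
    ranked = sorted(results, key=lambda r: r["score"], reverse=True)
    return [r["token_str"] for r in ranked[:k]]

def predict_masked(text, model_id=MODEL_ID, k=3):
    """Fill a [MASK] token in biomedical text with PubMedBERT.

    Deferred import: requires `pip install transformers torch` and a
    network connection (or cached weights) the first time it runs.
    """
    from transformers import pipeline
    fill = pipeline("fill-mask", model=model_id)
    return top_predictions(fill(text), k=k)
```

Called as `predict_masked("aspirin inhibits [MASK] synthesis.")`, this downloads the checkpoint on first use and returns the top candidate tokens; for NER or relation extraction you would instead fine-tune the same checkpoint with a task-specific head.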
Use with AI
Install the MCP server or CLI to fetch PubMedBERT documentation instantly:

Install command:

claude mcp add biocontext7 -- npx @biocontext7/mcp

Or share this page: biocontext7.com/tools/pubmedbert
Hugging Face Transformers — state-of-the-art machine learning library for natural language processing, computer vision, audio, and multimodal tasks. Provides pretrained models such as BERT, GPT-2, T5, and Llama.
3 shared topics • 1 shared operation
AntiFold — antibody-specific inverse folding model built on ESM-IF1, fine-tuned on antibody structures from the SAbDab and OAS databases. Predicts residue log-likelihoods for antibody variable domains.
3 shared topics
BioGPT — domain-specific generative pre-trained transformer for biomedical text mining and generation, pre-trained on 15M+ PubMed abstracts. Supports biomedical text generation, relation extraction, and question answering.
3 shared topics
Caduceus — bi-directional DNA foundation model built on the Mamba state-space architecture for genomic sequence modeling. Provides bi-directional long-range sequence understanding with reverse-complement equivariance.
3 shared topics
OGER (OntoGene Entity Recogniser) — dictionary-based biomedical named entity recognition (NER) tool that annotates text using ontology resources including UniProt, MeSH, CTD, CRAFT, and ChEBI.
3 shared topics