Geneformer — transformer-based foundation model pretrained on ~30 million single-cell transcriptomes for context-specific gene network analysis. Supports fine-tuning for cell type classification, gene classification, and in silico perturbation analysis.
Use with AI
Install the MCP server or CLI to instantly fetch Geneformer documentation:
Install command:

    claude mcp add biocontext7 -- npx @biocontext7/mcp
Evo — genomic foundation model for DNA sequence modeling at single-nucleotide resolution. Uses the StripedHyena architecture (7B parameters) trained on 2.7M prokaryotic and phage genomes (OpenGenome dataset).
3 shared topics • 1 shared operation
Nucleotide Transformer — DNA foundation models by InstaDeep and BioNTech for genomic sequence analysis. Pre-trained transformer models (50M to 2.5B parameters) on 3,202 diverse genomes generate embeddings for downstream genomic prediction tasks.
3 shared topics • 1 shared operation
BioGPT — domain-specific generative pre-trained transformer for biomedical text mining and generation, pre-trained on 15M+ PubMed abstracts. Supports biomedical text generation, relation extraction, question answering, and document classification.
2 shared topics • 2 shared operations
JAX — Google's high-performance numerical computing library combining a NumPy API with automatic differentiation (grad, value_and_grad), just-in-time XLA compilation (jit), automatic vectorization (vmap), and multi-device parallelization (pmap).
2 shared topics • 2 shared operations
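A minimal sketch of the three JAX transforms named above (grad, jit, vmap), composed on a toy quadratic loss; the function and values are illustrative, not from any of the listed tools:

```python
import jax
import jax.numpy as jnp

def loss(w):
    # toy quadratic loss; gradient w.r.t. w is 2 * w
    return jnp.sum(w ** 2)

grad_fn = jax.jit(jax.grad(loss))  # gradient function, compiled with XLA
batched = jax.vmap(grad_fn)        # vectorize the gradient over a batch axis

w = jnp.array([1.0, 2.0, 3.0])
print(grad_fn(w))                  # gradient at a single point: [2. 4. 6.]
print(batched(jnp.stack([w, 2 * w])))  # gradients for a batch of two points
```

Because the transforms compose, the same pattern scales from a single parameter vector to a batch without rewriting the loss.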
PyTorch — open-source deep learning framework for building and training neural networks in biology and life sciences. Provides tensor computation with GPU acceleration, automatic differentiation (autograd), and modular neural network building blocks (torch.nn).
2 shared topics • 2 shared operations
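A minimal sketch of PyTorch's autograd on a toy tensor (values chosen for illustration only):

```python
import torch

# mark the tensor as requiring gradients so autograd tracks operations on it
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = (x ** 2).sum()   # scalar output: y = sum(x_i^2)
y.backward()         # autograd computes dy/dx = 2 * x
print(x.grad)        # tensor([2., 4., 6.])
```

The same mechanism underlies training: a model's loss is a scalar built from tracked tensors, and `backward()` populates `.grad` on every parameter for the optimizer step.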