Nucleotide Transformer — DNA foundation models by InstaDeep and BioNTech for genomic sequence analysis. Transformer models (50M to 2.5B parameters) pre-trained on 3,202 diverse genomes generate embeddings for downstream genomics tasks.
Use with AI
Install the MCP server or CLI to instantly fetch Nucleotide Transformer documentation:
Install command
claude mcp add biocontext7 -- npx @biocontext7/mcp
Or share this page: biocontext7.com/tools/nucleotide-transformer
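Nucleotide Transformer models consume DNA as 6-mer tokens rather than raw characters. Below is a minimal sketch of that tokenization scheme; the greedy non-overlapping split, the fallback to single-base and `<unk>` tokens, and the vocabulary layout are simplifying assumptions, not the model's exact tokenizer.

```python
from itertools import product

# Toy vocabulary: all 4^6 = 4096 six-mers, plus single bases and <unk>.
BASES = "ACGT"
VOCAB = {"".join(kmer): i for i, kmer in enumerate(product(BASES, repeat=6))}
for base in BASES:
    VOCAB[base] = len(VOCAB)
VOCAB["<unk>"] = len(VOCAB)

def tokenize(seq):
    """Greedily split a DNA string into non-overlapping 6-mers; a trailing
    remainder or an ambiguous base (e.g. N) falls back to per-base / <unk>."""
    tokens = []
    i = 0
    while i < len(seq):
        chunk = seq[i:i + 6]
        if len(chunk) == 6 and chunk in VOCAB:
            tokens.append(chunk)
            i += 6
        else:
            tokens.append(seq[i] if seq[i] in VOCAB else "<unk>")
            i += 1
    return tokens

ids = [VOCAB[t] for t in tokenize("ACGTACGTACGTAN")]
```

The id list is what would feed the model's embedding layer; the real tokenizer also adds special tokens such as CLS/padding.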
Evo — genomic foundation model for DNA sequence modeling at single-nucleotide resolution. Uses StripedHyena architecture (7B parameters) trained on 2.7M prokaryotic and phage genomes (the OpenGenome dataset).
3 shared topics • 1 shared operation
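In contrast to k-mer vocabularies, single-nucleotide resolution means one token per base, trained autoregressively. A minimal sketch of that input/target construction, assuming a simplified 4-symbol vocabulary (Evo's actual tokenizer is byte-level and handles more symbols):

```python
# One token per nucleotide -- a simplification of byte-level tokenization.
STOI = {ch: i for i, ch in enumerate("ACGT")}

def encode(seq):
    return [STOI[ch] for ch in seq]

def shift_for_lm(seq):
    """Next-token prediction pairs: inputs are the sequence minus its last
    base, targets the sequence minus its first base."""
    ids = encode(seq)
    return ids[:-1], ids[1:]

inputs, targets = shift_for_lm("ACGT")
```

At position t the model sees `inputs[:t+1]` and is trained to predict `targets[t]`, i.e. the next base.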
Geneformer — transformer-based foundation model pretrained on ~30 million single-cell transcriptomes for context-specific gene network analysis. Supports fine-tuning for cell type classification and other gene- and cell-level prediction tasks.
3 shared topics • 1 shared operation
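Geneformer represents each cell as a rank-value encoding: gene expression is normalized against corpus-wide statistics, then genes are listed as tokens from highest to lowest normalized value. A sketch of that idea, where the per-gene median normalization factors and the dropping of zero-count genes are assumptions standing in for the library's actual preprocessing:

```python
import numpy as np

def rank_value_encode(expression, corpus_medians, genes):
    """Normalize each gene's count by its corpus-wide median expression,
    then return gene tokens ranked from highest to lowest normalized
    value; zero-count genes are dropped."""
    norm = np.asarray(expression, dtype=float) / np.asarray(corpus_medians, dtype=float)
    order = np.argsort(-norm)
    return [genes[i] for i in order if norm[i] > 0]

genes = ["TP53", "GAPDH", "ACTB"]
# TP53: 4/2 = 2.0, GAPDH: 2/4 = 0.5, ACTB: 0 -> dropped
cell_tokens = rank_value_encode([4.0, 2.0, 0.0], [2.0, 4.0, 1.0], genes)
```

The resulting token sequence, not raw counts, is what the transformer attends over, which is what makes housekeeping genes less dominant than in count space.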
GATK VQSR (Variant Quality Score Recalibration) — machine learning-based variant filtering for GATK Best Practices germline pipelines. Trains a Gaussian mixture model on truth/training resource datasets and assigns each variant a VQSLOD score used for filtering.
2 shared topics • 1 shared operation
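The core idea behind VQSLOD is a log-odds ratio: likelihood of a variant's annotations under a model fit to trusted variants, minus the likelihood under a background model. A toy one-dimensional, single-component version (VQSR proper fits multivariate Gaussian mixtures over several annotations; the data here is invented for illustration):

```python
import numpy as np

def fit_gaussian(values):
    """Fit a single 1-D Gaussian -- a stand-in for VQSR's mixture model."""
    x = np.asarray(values, dtype=float)
    return x.mean(), x.std()

def log_likelihood(x, mean, std):
    return -0.5 * np.log(2 * np.pi * std**2) - (x - mean) ** 2 / (2 * std**2)

# "Good" model from truth-set annotation values, "bad" from background.
good = fit_gaussian([10.1, 9.8, 10.3, 9.9])
bad = fit_gaussian([2.0, 3.1, 1.5, 2.7])

def vqslod(x):
    """Toy VQSLOD-style score: log-odds of good vs. bad model."""
    return log_likelihood(x, *good) - log_likelihood(x, *bad)
```

Variants scoring above a chosen VQSLOD threshold pass the filter; the threshold is picked from a target truth-set sensitivity.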
JAX — Google's high-performance numerical computing library combining NumPy API with automatic differentiation (grad, value_and_grad), just-in-time XLA compilation (jit), and automatic vectorization (vmap).
2 shared topics • 1 shared operation
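The three transformations named above compose freely on plain Python functions. A short sketch putting grad, jit, and vmap together on an invented squared-error loss:

```python
import jax
import jax.numpy as jnp

def loss(w, x, y):
    # Mean squared error of a linear model (example function, not from JAX).
    return jnp.mean((x @ w - y) ** 2)

grad_loss = jax.jit(jax.grad(loss))  # XLA-compiled gradient w.r.t. w
batched_dot = jax.vmap(jnp.dot)      # dot product vectorized over a batch

w = jnp.array([1.0, 2.0])
x = jnp.array([[1.0, 0.0], [0.0, 1.0]])
y = jnp.array([0.0, 0.0])
g = grad_loss(w, x, y)               # analytic gradient, here [1.0, 2.0]
```

Because the transformations return ordinary callables, `jax.grad(loss)` can itself be vmapped or differentiated again for higher-order derivatives.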
Loom — HDF5-based file format and Python library (loompy) for storing and accessing large single-cell RNA-seq datasets. Supports sparse/dense matrices, row attributes (genes/features), and column attributes (cells/samples).
2 shared topics • 1 shared operation
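Because Loom is plain HDF5, its layout can be sketched with h5py alone: a root `matrix` dataset plus `row_attrs` and `col_attrs` groups whose datasets align with the matrix axes. This is a hand-rolled approximation for illustration (the real loompy library adds layers, graphs, global attributes, and validation):

```python
import h5py
import numpy as np

# Write a minimal loom-style file: genes x cells matrix with aligned attrs.
matrix = np.arange(1000).reshape(100, 10)  # 100 genes x 10 cells
with h5py.File("toy.loom", "w") as f:
    f.create_dataset("matrix", data=matrix)
    f.create_dataset("row_attrs/Gene",
                     data=np.array([f"gene{i}" for i in range(100)], dtype="S"))
    f.create_dataset("col_attrs/CellID",
                     data=np.array([f"cell{i}" for i in range(10)], dtype="S"))

with h5py.File("toy.loom", "r") as f:
    counts = f["matrix"][:, :3]        # slice cells without loading it all
    genes = f["row_attrs/Gene"][:5]
```

The slicing in the read block is the point of the format: HDF5 chunked storage lets you pull a few cells or genes out of a dataset far larger than memory.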