nanoBERT

nanoBERT is a nanobody-specific BERT model trained on 10 million INDI VHH sequences for masked-residue prediction and sequence representation. The API provides CPU-only, batched inference (up to 32 sequences, length ≤154 AAs) for encoding (mean and per-residue embeddings, logits), sequence infilling using "*" masks, and log-probability scoring. Typical uses include mapping nanobody mutational feasibility, ranking variants by model nativeness, and supplying embeddings for downstream stability or developability models.
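As an illustration, the sketch below shows how a client might call the encoding and infilling functions over HTTP with the Python requests library. The endpoint paths, authentication header, and payload keys are assumptions made for the example, not confirmed API details; consult the BioLM API documentation for the exact request schema.

import os
import requests

API_ROOT = "https://biolm.ai/api/v2"  # assumed base URL
HEADERS = {
    "Authorization": f"Token {os.environ['BIOLM_API_TOKEN']}",  # assumed auth scheme
    "Content-Type": "application/json",
}

# One VHH sequence with two masked positions marked by "*", per the description above.
masked_vhh = "QVQLVESGGGLVQAGGSLRLSCAAS**TFSSYAMGWFRQAPGKEREFVAAISW"

# Infill the masked residues (generator-style call). The batch and length
# limits (up to 32 sequences, each <=154 AAs) come from the description above.
resp = requests.post(
    f"{API_ROOT}/nanobert/generate/",  # assumed endpoint path
    headers=HEADERS,
    json={"items": [{"sequence": masked_vhh}]},
    timeout=60,
)
resp.raise_for_status()
print(resp.json())

# Encode the unmasked sequence to retrieve mean/per-residue embeddings and logits.
resp = requests.post(
    f"{API_ROOT}/nanobert/encode/",  # assumed endpoint path
    headers=HEADERS,
    json={"items": [{"sequence": masked_vhh.replace("*", "")}]},
    timeout=60,
)
resp.raise_for_status()
print(resp.json())

The same request pattern would apply to log-probability scoring for variant ranking; only the action in the endpoint path and the response fields would differ.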

Tags: antibody, embeddings, BERT, prediction, generation, language-model, embedding, nanobody

Capabilities

Predictor
Encoder
Explainer
Generator
Classifier
Similarity

Accelerate your lead generation

BioLM offers tailored AI solutions to meet your experimental needs. Our model-agnostic approach, scalable real-time GPU-backed APIs, and years of experience in biological data modeling deliver top-tier results at a competitive price.

