{
"cells": [
{
"cell_type": "markdown",
"id": "micro-analyst",
"metadata": {},
"source": [
"# Image Segmentation with CellPose-SAM\n",
"Since Version 4 CellPose uses a variaton of the [Segment-Anything-Model](https://segment-anything.com/). \n",
"\n",
"See also\n",
"* [Cellpose-SAM preprint](https://www.biorxiv.org/content/10.1101/2025.04.28.651001v1)\n",
"* [Cellpose on github](https://github.com/MouseLand/cellpose)\n",
"* [Cellpose-SAM example notebook](https://github.com/MouseLand/cellpose/blob/main/notebooks/run_Cellpose-SAM.ipynb)\n",
"\n",
"As usual, we start with loading an example image."
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "50aa598d-3943-4651-8a28-c45253c5c19f",
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"\n",
"Welcome to CellposeSAM, cellpose v\n",
"cellpose version: \t4.0.3 \n",
"platform: \twin32 \n",
"python version: \t3.11.11 \n",
"torch version: \t2.6.0! The neural network component of\n",
"CPSAM is much larger than in previous versions and CPU excution is slow. \n",
"We encourage users to use GPU/MPS if available. \n",
"\n",
"\n"
]
}
],
"source": [
"import cellpose"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "4799d0ae-93e1-41d9-93bb-ea0130a12612",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"from cellpose import models\n",
"import stackview\n",
"import numpy as np\n",
"from skimage.data import human_mitosis\n",
"from skimage.io import imread"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "adbb027d-98d9-49c0-bc5c-de4495552b2e",
"metadata": {
"tags": []
},
"outputs": [
{
"data": {
"text/html": [
"
\n",
"\n",
"\n",
" \n",
" | \n",
"\n",
"\n",
"\n",
"shape | (512, 512) | \n",
"dtype | uint8 | \n",
"size | 256.0 kB | \n",
"min | 7 | max | 255 | \n",
" \n",
" \n",
" | \n",
"
\n",
"
"
],
"text/plain": [
"[[ 8 8 8 ... 63 78 75]\n",
" [ 8 8 7 ... 67 71 71]\n",
" [ 9 8 8 ... 53 64 66]\n",
" ...\n",
" [ 8 9 8 ... 17 24 59]\n",
" [ 8 8 8 ... 17 22 55]\n",
" [ 8 8 8 ... 16 18 38]]"
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"image = human_mitosis()\n",
"stackview.insight(image)"
]
},
{
"cell_type": "markdown",
"id": "cordless-lebanon",
"metadata": {},
"source": [
"## Loading a pretrained model\n",
"CellPose-SAM comes with only a single model that generalizes for multiple images and channel variations."
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "deadly-tunisia",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"model = models.CellposeModel(gpu=True)"
]
},
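{
"cell_type": "markdown",
"id": "gpu-check-note",
"metadata": {},
"source": [
"As a quick sanity check (not part of the original workflow), we can ask Cellpose whether a GPU was actually found; `core.use_gpu()` returns `True` if a compatible device is available, and the model otherwise falls back to the CPU."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "gpu-check-code",
"metadata": {},
"outputs": [],
"source": [
"from cellpose import core\n",
"\n",
"# True if a CUDA/MPS device was detected\n",
"core.use_gpu()"
]
},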
{
"cell_type": "markdown",
"id": "derived-electricity",
"metadata": {},
"source": [
"We let the model \"evaluate\" the image to produce masks of segmented nuclei."
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "c7495d0b-186d-4694-8aa5-867f98d84106",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"masks, flows, styles = model.eval(image, \n",
" batch_size=32, \n",
" flow_threshold=0.4, \n",
" cellprob_threshold=0.0,\n",
" normalize={\"tile_norm_blocksize\": 0})"
]
},
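{
"cell_type": "markdown",
"id": "count-labels-note",
"metadata": {},
"source": [
"The `eval` call returns the label image `masks` together with the predicted `flows` and a `styles` vector. As a quick check (not part of the original workflow), we can count the segmented nuclei by counting the unique non-zero labels:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "count-labels-code",
"metadata": {},
"outputs": [],
"source": [
"# background is labeled 0; every nucleus carries its own integer label\n",
"len(np.unique(masks)) - 1"
]
},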
{
"cell_type": "markdown",
"id": "53de48e6-0a95-40dc-9dd7-3a032ab9d9de",
"metadata": {
"tags": []
},
"source": [
"We convert the label image to integer type because many downstream libraries expect this."
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "b07462eb-e856-49f1-bde9-ebe8127da599",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"masks = masks.astype(np.uint32)"
]
},
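{
"cell_type": "markdown",
"id": "regionprops-note",
"metadata": {},
"source": [
"With integer labels, downstream libraries such as scikit-image can work with the result directly. For example, we could measure the area of every nucleus using `regionprops` (a sketch for illustration; this step is not part of the original workflow):"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "regionprops-code",
"metadata": {},
"outputs": [],
"source": [
"from skimage.measure import regionprops\n",
"\n",
"# one entry per labeled nucleus\n",
"areas = [r.area for r in regionprops(masks)]\n",
"areas[:5]"
]
},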
{
"cell_type": "code",
"execution_count": 7,
"id": "264b0471-dfd3-4654-a02c-c8fcdce072f4",
"metadata": {
"tags": []
},
"outputs": [
{
"data": {
"text/html": [
"\n",
"\n",
"\n",
" \n",
" | \n",
"\n",
"\n",
"\n",
"shape | (512, 512) | \n",
"dtype | uint32 | \n",
"size | 1024.0 kB | \n",
"min | 0 | max | 329 | n labels | 329 | \n",
" \n",
"\n",
" | \n",
"
\n",
"
"
],
"text/plain": [
"[[0 0 0 ... 3 3 3]\n",
" [0 0 0 ... 3 3 3]\n",
" [0 0 0 ... 3 3 3]\n",
" ...\n",
" [0 0 0 ... 0 0 0]\n",
" [0 0 0 ... 0 0 0]\n",
" [0 0 0 ... 0 0 0]]"
]
},
"execution_count": 7,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"stackview.insight(masks)"
]
},
{
"cell_type": "markdown",
"id": "88578710-3658-4248-8de6-230c05c3ca98",
"metadata": {},
"source": [
"## Exercise\n",
"Load `../../data/blobs.tif` and apply Cellpose-SAM to it."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "12febfe7-9fb8-4e23-b869-4d918bfec3c9",
"metadata": {},
"outputs": [],
"source": []
},
{
"cell_type": "markdown",
"id": "edd55427-811c-42da-9822-22707f22994a",
"metadata": {},
"source": [
"Load `../../data/membrane2d.tif` and apply Cellpose-SAM to it."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "9b0de7c4-aaaf-41b5-ba4a-5364451c8271",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.11"
}
},
"nbformat": 4,
"nbformat_minor": 5
}