Using SAM2 with Label Studio for Image Annotation
Segment Anything 2, or SAM 2, is a model released by Meta in July 2024. An update to the original Segment Anything Model, SAM 2 provides even better object segmentation for both images and video. In this guide, we’ll show you how to use SAM 2 for better image labeling with Label Studio.
Click on the image below to watch our ML Evangelist Micaela Kaplan explain how to link SAM 2 to your Label Studio project. You’ll need to follow the instructions below to stand up an instance of SAM 2 before you can link your model!
Note that as of 8/1/2024, SAM 2 only runs on a GPU.
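If you want to confirm that a CUDA-capable GPU is visible before setting anything up, one quick check (assuming PyTorch is already installed in your environment) is:
python -c "import torch; print(torch.cuda.is_available())"
If this prints False, the model server will not be able to run, since cuda is currently the only supported device.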
Running from source
- To run the ML backend without Docker, you have to clone the repository and install all dependencies using pip:
git clone https://github.com/HumanSignal/label-studio-ml-backend.git
cd label-studio-ml-backend
pip install -e .
cd label_studio_ml/examples/segment_anything_2_image
pip install -r requirements.txt
- Download the segment-anything-2 repo into the root directory. Install the SegmentAnything model and download the checkpoints by following the official Meta documentation; a possible command sequence for this step is sketched below.
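A minimal sketch of that download step, assuming the official facebookresearch/segment-anything-2 repository layout and its checkpoint download helper; defer to Meta's documentation if the layout has changed:
git clone https://github.com/facebookresearch/segment-anything-2.git
cd segment-anything-2
pip install -e .
# download the pretrained checkpoints (includes the default sam2_hiera_large.pt)
cd checkpoints
./download_ckpts.sh
# return to label_studio_ml/examples/segment_anything_2_image before continuing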
Then you can start the ML backend on the default port 9090:
cd ../
label-studio-ml start ./segment_anything_2_image
- Connect the running ML backend server to Label Studio: go to your project Settings -> Machine Learning -> Add Model and specify http://localhost:9090 as the URL. Read more in the official Label Studio documentation.
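Depending on your storage setup, the backend may also need to fetch image data from Label Studio. Other ML backend examples in this repository read the Label Studio host and an API token from environment variables for that purpose; the variable names below follow that convention but are an assumption here, so check the example's README for the exact ones:
export LABEL_STUDIO_URL=http://localhost:8080   # assumed variable name and default host
export LABEL_STUDIO_API_KEY=<your-api-token>    # assumed variable name; your Label Studio API token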
Running with Docker (coming soon)
- Start the Machine Learning backend on http://localhost:9090 with the prebuilt image:
docker-compose up
- Validate that the backend is running:
$ curl http://localhost:9090/
{"status":"UP"}
- Connect to the backend from Label Studio running on the same host: go to your project Settings -> Machine Learning -> Add Model and specify http://localhost:9090 as the URL.
Configuration
Parameters can be set in docker-compose.yml before running the container.
The following common parameters are available:
- DEVICE - specify the device for the model server (currently only cuda is supported; cpu is coming soon)
- MODEL_CONFIG - the SAM 2 model configuration file (sam2_hiera_l.yaml by default)
- MODEL_CHECKPOINT - the SAM 2 model checkpoint file (sam2_hiera_large.pt by default)
- BASIC_AUTH_USER - specify the basic auth user for the model server
- BASIC_AUTH_PASS - specify the basic auth password for the model server
- LOG_LEVEL - set the log level for the model server
- WORKERS - specify the number of workers for the model server
- THREADS - specify the number of threads for the model server
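As an illustration, these can be set in the environment section of docker-compose.yml; the service name and the non-default values below are placeholders, while the model config and checkpoint shown are the documented defaults:
services:
  segment_anything_2_image:   # placeholder service name; use the one in the shipped docker-compose.yml
    environment:
      - DEVICE=cuda
      - MODEL_CONFIG=sam2_hiera_l.yaml
      - MODEL_CHECKPOINT=sam2_hiera_large.pt
      - LOG_LEVEL=DEBUG
      - WORKERS=1
      - THREADS=4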
Customization
The ML backend can be customized by adding your own models and logic inside the ./segment_anything_2_image directory.
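As a rough sketch of what such a customization could look like: the example backends in this repository subclass LabelStudioMLBase and override its predict() method. The class name and return format below are illustrative, and the exact interface can differ between label-studio-ml-backend versions:
# illustrative sketch only, not the shipped implementation
from label_studio_ml.model import LabelStudioMLBase

class CustomSAM2(LabelStudioMLBase):
    def predict(self, tasks, context=None, **kwargs):
        # `tasks` is a list of Label Studio task dicts; `context` carries the
        # interactive prompt (for example, keypoints) when pre-annotation is triggered
        predictions = []
        for task in tasks:
            # run your own segmentation logic here and append results
            # in the Label Studio prediction format
            predictions.append({"result": [], "score": 0.0})
        return predictions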
