A strange bug when using fastai library with Weights & Biases

When I was using the fastai library along with the Weights & Biases callback to track my model training, I noticed a strange error when running inference with the same models created with the fastai library. Let’s figure out the issue …
Category: fastai
Author: Kurian Benoy
Published: July 26, 2022

After attending an introductory session on using Weights & Biases with fastai, conducted by Thomas Capelle, I was excited to use the Weights & Biases library in a few of my hobby projects. I was working on training an image classification model on the dataset from the Kaggle competition Petals to the Metal.


In general, whenever I pass callbacks to a fastai Learner object, I pass them directly to vision_learner, as shown in the code below.

from fastai.vision.all import *
from fastai.callback.wandb import WandbCallback
import wandb

wandb.init()  # WandbCallback requires an active wandb run

arch = "convnext_tiny_in22k"
learn = vision_learner(
    data,
    arch,
    metrics=[accuracy, error_rate],  # accuracy must be tracked for monitor="accuracy"
    cbs=[
        WandbCallback(log_preds=False, log_model=True),
        SaveModelCallback(monitor="accuracy"),
    ],
)
learn.fine_tune(5)

I exported this model, as I was trying to create a Hugging Face Space to identify various flower species.

learn.export()

Now I went ahead and created the inference code, with its requirements file, for this model. This is when I noticed that the exported model requires the wandb library just to run the inference code. At first, I was totally surprised about why this was happening.
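
Here is a minimal sketch of the inference code I was running, assuming fastai’s default export.pkl file name:

from fastai.learner import load_learner

# Unpickling the Learner also unpickles the attached WandbCallback,
# which pulls in fastai.callback.wandb (and hence wandb), so this
# raises ModuleNotFoundError in an environment without wandb:
learn = load_learner("export.pkl")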

Why this annoying behaviour?

It’s because callbacks passed to the Learner class, or its variants like the vision_learner used in computer vision, are stored on the Learner itself and stick around, so learn.export() pickles them along with the model. In my case, I don’t want the callback hanging around the Learner forever, as it’s only there to monitor the training job.
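
You can verify this by inspecting the Learner’s cbs attribute; callbacks passed at construction time stay attached, so they end up inside the pickle written by learn.export(). A small check:

# Callbacks passed via cbs= at construction stay on the Learner:
assert any(isinstance(cb, WandbCallback) for cb in learn.cbs)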

After a bit of googling, I found the solution in one of the forum posts, written by Wayde Gilliam.

Important

Instead of adding your callback to Learner … if it is simply used for training, just include it in your call(s) to fit or fit_one_cycle. As the callback is no longer associated to your Learner, they won’t interfere with your call to get_preds().

Original answer.

So in order to fix it, I just passed the callbacks directly to the fine_tune method instead. Let’s check the code to pass callbacks this way.

arch = "convnext_tiny_in22k"
learn = vision_learner(data, arch, metrics=[accuracy, error_rate])
# Callbacks passed to fine_tune are attached only for the duration of
# training and removed afterwards, so they are not pickled on export:
learn.fine_tune(
    5,
    cbs=[
        WandbCallback(log_preds=False, log_model=True),
        SaveModelCallback(monitor="accuracy"),
    ],
)
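
With the callbacks scoped to training, the exported pickle no longer references wandb; a quick sanity check, again assuming the default export.pkl name:

learn.export()  # the pickle no longer contains WandbCallback

# In a fresh environment without wandb installed, loading now works:
from fastai.learner import load_learner
inference_learn = load_learner("export.pkl")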

Hence I learned a valuable lesson, which fixed the bug in the inference code for the Hugging Face Space I was creating.

Zach Mueller also confirmed this is the case.