fastcore and fastai

Knowing enough of the foundations to get through the codebase
Lesson 7

Introduction

While this is a lesson, this particular part is meant to serve as a reference sheet. I’ll explain in detail what each of these items does, but remember you can always come back here when you want a quick way to understand the parts of the fastai and fastcore libraries I deem “important” to understand conceptually.

Monkey Patches

These are key methods patched onto an existing class when you import it through fastcore (see the example after the list below), such as

from fastcore.xtras import Path

  • Path.readlines
  • Path.read_json
  • Path.mk_write
  • Path.relpath
  • Path.ls
  • Path.delete
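
For instance, the patched ls method returns an L of the directory's contents (the output below is illustrative; yours will differ):

from fastcore.xtras import Path

path = Path(".")
path.ls()
(#2) [Path('data'),Path('notebook.ipynb')]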

Basic Functions, Classes, and Ideas

store_attr

Stores a function's arguments as attributes on its class instance (most commonly used inside __init__):

from fastcore.basics import store_attr
class A:
    def __init__(self, b, c):
        store_attr()

# Check they exist
a = A(1,2)
assert a.b == 1
assert a.c == 2

The L

An expanded list class that allows you to perform a variety of shorthand operations on it:

from fastcore.foundation import L

my_list = L(1,2,3,3)
my_list.map(lambda x: x+1)
(#4) [2,3,4,4]
my_list.count(3)
2
my_list.sum()
9

AttrDict, obj2dict, and dict2obj

These are quick ways to convert between dictionaries and objects, relying on the AttrDict class. Essentially, an AttrDict is a dictionary whose keys can also be accessed as attributes:

from fastcore.basics import AttrDict
from fastcore.xtras import dict2obj, obj2dict

o = {"a":0, "b":1, "c":2}
o_obj = dict2obj(o)
o_dict = obj2dict(o_obj)

o["a"], o_obj.a, o_obj["a"]
(0, 0, 0)
type(o), type(o_obj), type(o_dict)
(dict, fastcore.basics.AttrDict, dict)

docments

A way to document code next to each function parameter. fastcore can parse these comments, and nbdev uses them when building documentation.

def addition(
    a:int, # The first number
    b:int, # The second number
):
    return a+b

Generally not recommended due to IDE issues; see my future Python course for more on this.
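
For reference, fastcore can parse these comments back out programmatically; a minimal sketch (output shown approximately):

from fastcore.docments import docments

docments(addition)
{'a': 'The first number', 'b': 'The second number', 'return': None}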

Testing Suite

fastcore comes with a testing suite for assert statements in notebooks:

import fastcore.test as fasttest

fasttest.equals(0,0)
True

Generally they are very close to their pytest counterparts (a quick example follows the list):

  • test_eq
  • test_ne
  • test_close
  • test_is
  • test_shuffled
  • test_stdout
  • test_warns
  • ExceptionExpected
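
For instance, test_eq passes silently when its arguments match and raises an AssertionError otherwise:

from fastcore.test import test_eq

test_eq([1,2], [1,2])  # passes silently
test_eq(0, 1)          # raises an AssertionError describing the mismatch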

Transform

The core of fastai’s transforms, which is built on the TypeDispatch system.

General idea:

Depending on the type of the input passed, one of n registered functions will be called for that particular input:

from fastcore.dispatch import typedispatch

@typedispatch
def my_transform(value:int):
    print("Passed an `int`!")

@typedispatch
def my_transform(value:str):
    print("Passed a `str`!")

@typedispatch
def my_transform(value:bool):
    print("Passed a `bool`!")
my_transform(1)
Passed an `int`!
my_transform("Hi!")
Passed a `str`!
my_transform(False)
Passed a `bool`!

Scaled up, this dispatch mechanism becomes the core of fastai’s Transform API.
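
As a small illustration of that idea (a minimal sketch; Transform collects every encodes method and dispatches on the annotated type):

from fastcore.transform import Transform

class MyTransform(Transform):
    def encodes(self, x:int): return x+1
    def encodes(self, x:str): return x.upper()

tfm = MyTransform()
tfm(1), tfm("hi")
(2, 'HI')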

Pipeline

The Pipeline class is set up to run an input through a series of transforms in order, such as the setups shown in previous lessons:

from fastcore.transform import Pipeline

# Plain callables are wrapped into Transforms automatically
pipe = Pipeline([lambda x: x+1, lambda x: x*2])
pipe(3)
8

From fastai

Everything shown previously came from fastcore. Now we will see examples that come from fastai.

defaults

Contains fastai’s default values, which are declared all around fastai’s codebase and are extremely hard to trace.
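
Since defaults is just a SimpleNamespace, the easiest way to see what has been set is to print it; which attributes appear depends on which fastai modules you have imported:

from fastai.torch_core import defaults

print(defaults)  # shows every default currently set (e.g. use_cuda, activation, depending on imports)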

apply

Applies a function recursively to an iterable object, with optional extra arguments passed through to the function:

from fastai.torch_core import apply

o = [1,2,3]
apply(lambda x: x+1, o)
[2, 3, 4]
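
apply also recurses through nested structures, such as dictionaries of lists:

apply(lambda x: x*2, {"a": [1, 2], "b": 3})
{'a': [2, 4], 'b': 6}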

concat

Concatenates tensors, arrays, lists, or tuples:

from fastai.torch_core import concat

concat([1,2,3], (4, 5), 6)
[1, 2, 3, 4, 5, 6]

find_bs

Finds the batch size of a particular item:

from fastai.torch_core import find_bs
import torch

t = torch.rand(64, 3, 224, 224)
find_bs(t)
64
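
find_bs also works on nested batches, such as an (x, y) tuple, by returning the batch size of the first tensor it finds:

batch = (t, torch.rand(64))
find_bs(batch)
64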

Module

Removes the need to call super().__init__() when subclassing PyTorch's nn.Module:

from fastai.torch_core import Module

class MyLayer(Module):
    def __init__(self, arg1):
        self.arg1 = arg1
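
A minimal usage sketch (AddConst is a made-up layer for illustration; forward works exactly as it would on a plain nn.Module):

import torch
from fastai.torch_core import Module

class AddConst(Module):
    def __init__(self, c): self.c = c
    def forward(self, x): return x + self.c

layer = AddConst(1)
layer(torch.zeros(2))
tensor([1., 1.])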

Other Resources

There are many, many other functions that fastai and fastcore utilize. Most of what I study is located in either fastcore.xtras or fastai.torch_core.