fastcore: An Underrated Python Library


By Hamel Husain, Staff Machine Learning Engineer at GitHub




I recently embarked on a journey to sharpen my Python skills: I wanted to learn advanced patterns, idioms, and techniques. I started by reading books on advanced Python; however, the information didn't seem to stick without having somewhere to apply it. I also wanted the ability to ask questions from an expert while I was learning, which is an arrangement that's hard to find! That's when it occurred to me: What if I could find an open source project that has fairly advanced Python code and write documentation and tests for it? I made a bet that if I did this it would force me to learn everything very deeply, and the maintainers would be appreciative of my work and be willing to answer my questions.

And that is exactly what I did over the past month! I'm pleased to report that it has been one of the most efficient learning experiences I've ever had. I've discovered that writing documentation forced me to deeply understand not just what the code does but also why the code works the way it does, and to explore edge cases while writing tests. Most importantly, I was able to ask questions when I was stuck, and maintainers were willing to commit extra time knowing that their mentorship was in service of making their code more accessible! It turns out the library I chose, fastcore, contains some of the most fascinating Python I've ever encountered, as its purpose and goals are fairly unique.

For the uninitiated, fastcore is a library that many projects are built on top of. Most importantly, fastcore extends the Python programming language and strives to eliminate boilerplate and add useful functionality for common tasks. In this blog post, I'm going to highlight some of my favorite tools that fastcore provides, rather than sharing what I learned about Python. My goal is to pique your interest in this library, and hopefully motivate you to check out the documentation after you're done to learn more!


Why fastcore is interesting


  1. Get exposed to ideas from other languages without leaving Python: I've always heard that it's beneficial to learn other languages in order to become a better programmer. From a pragmatic point of view, I've found it difficult to learn other languages because I could never use them at work. fastcore extends Python to include patterns found in languages as diverse as Julia, Ruby and Haskell. Now that I understand these tools, I'm motivated to learn other languages.
  2. You get a new set of pragmatic tools: fastcore includes utilities that will allow you to write more concise, expressive code, and perhaps solve new problems.
  3. Learn more about the Python programming language: Because fastcore extends the Python programming language, many advanced concepts are exposed along the way. For the motivated, this is a great way to see how many of the internals of Python work.


A whirlwind tour through fastcore

Here are some things you can do with fastcore that immediately caught my attention.


Making **kwargs transparent

Whenever I see a function that has the argument **kwargs, I cringe a little. This is because it means the API is obfuscated and I have to read the source code to figure out what the valid parameters might be. Consider the example below:

import inspect

def baz(a, b=2, c=3, d=4): return a + b + c

def foo(c, a, **kwargs):
    return c + baz(a, **kwargs)

inspect.signature(foo)
<Signature (c, a, **kwargs)>

Without reading the source code, it can be hard for me to know that foo also accepts the extra parameters b and d. We can fix this with delegates:

from fastcore.meta import delegates

def baz(a, b=2, c=3, d=4): return a + b + c

@delegates(baz) # this decorator will pass down keyword arguments from baz
def foo(c, a, **kwargs):
    return c + baz(a, **kwargs)

<Signature (c, a, b=2, d=4)>

You can customize the behavior of this decorator. For example, you can have your cake and eat it too by passing down your arguments and also keeping **kwargs:

@delegates(baz, keep=True)
def foo(c, a, **kwargs):
    return c + baz(a, **kwargs)

<Signature (c, a, b=2, d=4, **kwargs)>

You can also exclude arguments. For example, here we exclude argument d from delegation:

def basefoo(a, b=2, c=3, d=4): pass

@delegates(basefoo, but=['d']) # exclude `d`
def foo(c, a, **kwargs): pass


You can also delegate between classes:

class BaseFoo:
    def __init__(self, e, c=2): pass

@delegates() # since no argument was passed here, we delegate to the superclass
class Foo(BaseFoo):
    def __init__(self, a, b=1, **kwargs): super().__init__(**kwargs)

<Signature (a, b=1, c=2)>

For more information, read the docs on delegates.
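If you're curious how a decorator like this can work, the mechanism is Python's inspect module: the signature that inspect reports for a function can be overridden by setting __signature__. Here is a simplified sketch of the idea (my own toy code, not fastcore's actual implementation, which handles many more cases):

```python
import inspect

def delegates_sketch(base):
    "Toy stand-in for `delegates`: advertise `base`'s keyword defaults on the wrapped function."
    def decorator(f):
        sig = inspect.signature(f)
        # keep every parameter except the **kwargs catch-all
        params = [p for p in sig.parameters.values()
                  if p.kind is not inspect.Parameter.VAR_KEYWORD]
        present = {p.name for p in params}
        # pull in parameters with defaults from `base` that aren't already present
        for p in inspect.signature(base).parameters.values():
            if p.default is not inspect.Parameter.empty and p.name not in present:
                params.append(p)
        f.__signature__ = sig.replace(parameters=params)  # override what inspect reports
        return f
    return decorator

def baz(a, b=2, c=3, d=4): return a + b + c

@delegates_sketch(baz)
def foo(c, a, **kwargs): return c + baz(a, **kwargs)

print(inspect.signature(foo))  # (c, a, b=2, d=4)
```

Calling foo is unaffected; only the reported signature changes, which is exactly what makes tooling and docs more transparent.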


Avoid boilerplate when setting instance attributes

Have you ever wondered if it was possible to avoid the boilerplate involved with setting attributes in __init__?

class Test:
    def __init__(self, a, b, c): 
        self.a, self.b, self.c = a, b, c

Ouch! That was painful. Look at all the repeated variable names. Do I really have to repeat myself like this when defining a class? Not anymore! Check out store_attr:

from fastcore.utils import store_attr

class Test:
    def __init__(self, a, b, c): store_attr()

t = Test(5,4,3)
assert t.b == 4

You can also exclude certain attributes:

class Test:
    def __init__(self, a, b, c): store_attr(but='c')

t = Test(5,4,3)
assert t.b == 4
assert not hasattr(t, 'c')

There are many more ways of customizing and using store_attr than I've highlighted here. Check out the docs for more detail.

P.S. You might be thinking that Python dataclasses also allow you to avoid this boilerplate. While true in some cases, store_attr is more flexible.1

1. For example, store_attr doesn't rely on inheritance, which means you won't get stuck using multiple inheritance when using this with your own classes. Also, unlike dataclasses, store_attr doesn't require Python 3.7 or higher. Furthermore, you can use store_attr at any time in the object lifecycle, and in any location in your class, to customize the behavior of how and when variables are stored.
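To connect this back to the "learn Python internals" theme: a store_attr-style helper can be built on frame introspection, reading the caller's local variables. Here is a toy sketch (my own simplified illustration; fastcore's real store_attr supports far more options, such as name selection and casting):

```python
import sys

def store_attr_sketch():
    "Toy version of `store_attr`: copy the calling __init__'s local variables onto self."
    locs = sys._getframe(1).f_locals   # locals of the frame that called us
    obj = locs['self']
    for name, val in locs.items():
        if name != 'self':
            setattr(obj, name, val)

class Test:
    def __init__(self, a, b, c):
        store_attr_sketch()

t = Test(5, 4, 3)
print(t.a, t.b, t.c)  # 5 4 3
```

The trick is sys._getframe(1), which gives access to the caller's stack frame, so the helper never needs the attribute names spelled out.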


Avoiding subclassing boilerplate

One thing I hate about Python is the super().__init__() boilerplate associated with subclassing. For example:

class ParentClass:
    def __init__(self): self.some_attr = 'hello'

class ChildClass(ParentClass):
    def __init__(self):
        super().__init__()

cc = ChildClass()
assert cc.some_attr == 'hello' # only accessible b/c you used super

We can avoid this boilerplate by using the metaclass PrePostInitMeta. We define a new class called NewParent that is a wrapper around ParentClass:

from fastcore.meta import PrePostInitMeta

class NewParent(ParentClass, metaclass=PrePostInitMeta):
    def __pre_init__(self, *args, **kwargs): super().__init__()

class ChildClass(NewParent):
    def __init__(self): pass

sc = ChildClass()
assert sc.some_attr == 'hello'
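To see roughly how a metaclass can pull this off, here is a toy metaclass (my own simplified sketch, not fastcore's actual implementation) that calls __pre_init__ before, and __post_init__ after, the normal __init__:

```python
class PrePostInitMetaSketch(type):
    "Toy metaclass: run __pre_init__ before (and __post_init__ after) __init__."
    def __call__(cls, *args, **kwargs):
        obj = cls.__new__(cls)
        if hasattr(obj, '__pre_init__'):  obj.__pre_init__(*args, **kwargs)
        obj.__init__(*args, **kwargs)
        if hasattr(obj, '__post_init__'): obj.__post_init__(*args, **kwargs)
        return obj

class ParentClass:
    def __init__(self): self.some_attr = 'hello'

class NewParent(ParentClass, metaclass=PrePostInitMetaSketch):
    def __pre_init__(self, *args, **kwargs): super().__init__()

class ChildClass(NewParent):
    def __init__(self): pass  # no super().__init__() needed

sc = ChildClass()
print(sc.some_attr)  # hello
```

The key insight is that a metaclass's __call__ controls the whole instance-creation protocol, so it can inject extra steps around __init__.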


Type Dispatch

Type dispatch, or multiple dispatch, allows you to change the way a function behaves based upon the input types it receives. This is a prominent feature in some programming languages like Julia. For example, here is a conceptual example of how multiple dispatch works in Julia, returning different values depending on the input types of x and y:

collide_with(x::Asteroid, y::Asteroid) = ... 
# deal with asteroid hitting asteroid

collide_with(x::Asteroid, y::Spaceship) = ... 
# deal with asteroid hitting spaceship

collide_with(x::Spaceship, y::Asteroid) = ... 
# deal with spaceship hitting asteroid

collide_with(x::Spaceship, y::Spaceship) = ... 
# deal with spaceship hitting spaceship

Type dispatch can be especially useful in data science, where you might allow different input types (i.e. NumPy arrays and Pandas DataFrames) to a function that processes data. Type dispatch allows you to have a common API for functions that do similar tasks.

Unfortunately, Python doesn't support this out of the box. Fortunately, there is the @typedispatch decorator to the rescue. This decorator relies upon type hints in order to route inputs to the correct version of the function:

import numpy as np
from fastcore.dispatch import typedispatch

@typedispatch
def f(x:str, y:str): return f'{x}{y}'

@typedispatch
def f(x:np.ndarray): return x.sum()

@typedispatch
def f(x:int, y:int): return x+y

To illustrate type dispatch at work for the function f: calling f('Hello ', 'World!') returns 'Hello World!', while f(2, 3) returns 5.
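As an aside, Python's standard library ships a limited form of this idea: functools.singledispatch, which dispatches on the type of the first argument only. It is a useful point of comparison with @typedispatch:

```python
from functools import singledispatch

@singledispatch
def describe(x):
    return f"something else: {x!r}"

@describe.register
def _(x: int):
    return f"an int: {x}"

@describe.register
def _(x: str):
    return f"a string: {x}"

print(describe(3))      # an int: 3
print(describe("hi"))   # a string: hi
print(describe(3.5))    # something else: 3.5
```

Unlike @typedispatch, singledispatch cannot look at a second argument, which is why the multi-argument collide_with examples above need true multiple dispatch.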

There are limitations to this feature, as well as other ways of using this functionality, that you can read about here. In the process of learning about typed dispatch, I also found a Python library called multipledispatch made by Matthew Rocklin (the creator of Dask).

After using this feature, I'm now motivated to learn languages like Julia to discover what other paradigms I might be missing.


A better version of functools.partial

functools.partial is a great utility that creates functions from other functions, letting you set default values. Let's take this function, for example, which filters a list to only contain values >= val:

test_input = [1,2,3,4,5,6]
def f(arr, val): 
    "Filter a list to remove any values that are less than val."
    return [x for x in arr if x >= val]

f(test_input, 3)

You can create a new function out of this function using partial that sets the default value of val to 5:

from functools import partial

filter5 = partial(f, val=5)

One issue with partial is that it removes the original docstring and replaces it with a generic one:

filter5.__doc__

'partial(func, *args, **keywords) - new function with partial application\n    of the given arguments and keywords.\n'

fastcore.utils.partialler fixes this, and makes sure the docstring is retained so that the new API is transparent:

from fastcore.utils import partialler

filter5 = partialler(f, val=5)
filter5.__doc__

'Filter a list to remove any values that are less than val.'
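If you're curious why partial loses the docstring: a partial object is a new object that doesn't copy metadata from the wrapped function. The stdlib's functools.update_wrapper can copy it over manually, which is similar in spirit to what partialler gives you for free:

```python
from functools import partial, update_wrapper

def f(arr, val):
    "Filter a list to remove any values that are less than val."
    return [x for x in arr if x >= val]

# a plain partial carries partial's generic docstring, not f's
plain = partial(f, val=5)
# copying metadata over from f restores it, similar in spirit to partialler
filter5 = update_wrapper(partial(f, val=5), f)

print(filter5.__doc__)              # Filter a list to remove any values that are less than val.
print(filter5([1, 2, 3, 4, 5, 6]))  # [5, 6]
```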


Composition of functions

A technique that is pervasive in functional programming languages is function composition, whereby you chain a bunch of functions together to achieve some kind of result. This is especially useful when applying various data transformations. Consider a toy example where I have three functions: (1) removes elements of a list less than 5 (from the prior section), (2) adds 2 to each number, (3) sums all the numbers:

from fastcore.utils import compose

def add(arr, val): return [x + val for x in arr]
def arrsum(arr): return sum(arr)

# See the previous section on partialler
add2 = partialler(add, val=2)

transform = compose(filter5, add2, arrsum)

But why is this useful? You might be thinking, I can accomplish the same thing with:

arrsum(add2(filter5(arr)))

You are not wrong! However, composition gives you a convenient interface in case you want to do something like the following:

import numpy as np

def fit(x, transforms:list):
    "fit a model after performing transformations"
    x = compose(*transforms)(x)
    y = [np.mean(x)] * len(x) # it's a dumb model.  Don't judge me
    return y

# filters out elements < 5, adds 2, then predicts the mean
fit(x=[1,2,3,4,5,6], transforms=[filter5, add2])

For more information about compose, read the docs.
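Composition itself is a one-liner if you want to see the mechanics. Here is a toy left-to-right compose built on functools.reduce (my own sketch; fastcore's compose handles more cases):

```python
from functools import reduce

def compose_sketch(*funcs):
    "Left-to-right composition: compose_sketch(f, g, h)(x) == h(g(f(x)))."
    return lambda x: reduce(lambda acc, fn: fn(acc), funcs, x)

def filter5(arr): return [v for v in arr if v >= 5]
def add2(arr):    return [v + 2 for v in arr]

# [1..6] -> [5, 6] -> [7, 8] -> 15
transform = compose_sketch(filter5, add2, sum)
print(transform([1, 2, 3, 4, 5, 6]))  # 15
```

reduce threads the intermediate value through each function in order, which is exactly the pipeline shape that makes compose convenient for data transformations.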


A more useful __repr__

In Python, __repr__ gives you information about an object, which is useful for logging and debugging. Below is what you get by default when you define a new class. (Note: we are using store_attr, which was discussed earlier.)

class Test:
    def __init__(self, a, b=2, c=3): store_attr() # `store_attr` was discussed previously

<__main__.Test at 0x7ffcd766cee0>

We can use basic_repr to quickly give us a more sensible default:

from fastcore.utils import basic_repr

class Test:
    def __init__(self, a, b=2, c=3): store_attr() 
    __repr__ = basic_repr('a,b,c')
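To demystify what a helper like this does, here is a toy version (my own sketch, not fastcore's implementation) showing how a __repr__ can be generated from a list of field names:

```python
def basic_repr_sketch(names):
    "Toy `basic_repr`: build a __repr__ from a comma-separated list of field names."
    fields = [n.strip() for n in names.split(',')]
    def __repr__(self):
        attrs = ', '.join(f'{n}={getattr(self, n)!r}' for n in fields)
        return f'{type(self).__name__}({attrs})'
    return __repr__

class Test:
    def __init__(self, a, b=2, c=3): self.a, self.b, self.c = a, b, c
    __repr__ = basic_repr_sketch('a,b,c')

print(Test(1))  # Test(a=1, b=2, c=3)
```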


Monkey Patching With A Decorator

It can be convenient to monkey patch with a decorator, which is especially helpful when you want to patch an external library you are importing. We can use the decorator @patch from fastcore.foundation along with type hints like so:

from fastcore.foundation import patch

class MyClass(int): pass  

@patch
def func(self:MyClass, a): return self+a

mc = MyClass(3)

Now, MyClass has an additional method named func:

Still not convinced? I'll show you another example of this kind of patching in the next section.
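The core trick behind @patch is small enough to sketch: read the class out of the self type hint and setattr the function onto it. Below is a toy version (my own simplified illustration; fastcore's real @patch also copies metadata and supports more targets):

```python
def patch_sketch(f):
    "Toy `@patch`: attach f as a method on the class named in its `self` annotation."
    cls = f.__annotations__['self']
    setattr(cls, f.__name__, f)
    return f

class MyClass(int): pass

@patch_sketch
def func(self: MyClass, a): return self + a

mc = MyClass(3)
print(mc.func(10))  # 13
```

Because the function is assigned onto the class, it participates in normal attribute lookup and binds self like any other method.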


A better pathlib.Path

When you see these extensions to pathlib.Path, you won't ever use vanilla pathlib again! A number of additional methods have been added to pathlib, such as:

  • Path.readlines: same as with open('somefile', 'r') as f: f.readlines()
  • Path.read: same as with open('somefile', 'r') as f: f.read()
  • Path.save: saves file as pickle
  • Path.load: loads pickle file
  • Path.ls: shows the contents of the path as a list
  • etc.

Read more about this here. Here is an illustration of ls:

from fastcore.utils import *
from pathlib import Path
p = Path('.')
p.ls() # you don't get this with vanilla pathlib.Path!!

(#7) [Path('2020-09-01-fastcore.ipynb'),Path(''),Path('fastcore_imgs'),Path('2020-02-20-test.ipynb'),Path('.ipynb_checkpoints'),Path('2020-02-21-introducing-fastpages.ipynb'),Path('my_icons')]

Wait! What's going on here? We just imported pathlib.Path – why are we getting this new functionality? That's because we imported the fastcore.utils module, which patches pathlib via the @patch decorator discussed earlier. Just to drive the point home on why the @patch decorator is useful, I'll go ahead and add another method to Path right now:

@patch
def fun(self:Path): return "This is fun!"

Path('.').fun()
'This is fun!'


That's magical, right? I know! That's why I'm writing about it!


An Even More Concise Way To Create Lambdas

Self, with an uppercase S, is an even more concise way to create lambdas that call methods on an object. For example, let's create a lambda that takes the sum of a NumPy array:

import numpy as np
arr = np.array([1,2,3,4,5])

f = lambda a: a.sum()
assert f(arr) == 15

You can use Self in the same way:

from fastcore.utils import Self

f = Self.sum()
assert f(arr) == 15

Let's create a lambda that does a groupby and mean on a Pandas DataFrame:

import pandas as pd
df = pd.DataFrame({'Some Column': ['a', 'a', 'b', 'b', ], 
                   'Another Column': [5, 7, 50, 70]})

f = Self.groupby('Some Column').mean()
f(df)

             Another Column
Some Column                
a                         6
b                        60


Read more about Self in the docs.
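The mechanics behind a Self-style object are a fun use of __getattr__. Here's a toy version (my own sketch; unlike the real Self, it supports only a single method call rather than a chain like groupby(...).mean()):

```python
class _SelfSketch:
    "Toy `Self`: capture one method call, replay it later on any object."
    def __getattr__(self, name):
        def capture(*args, **kwargs):
            # return a function that applies the recorded call to `obj`
            return lambda obj: getattr(obj, name)(*args, **kwargs)
        return capture

SelfSketch = _SelfSketch()

f = SelfSketch.upper()
print(f('hello'))        # HELLO
g = SelfSketch.count(2)
print(g([1, 2, 2, 3]))   # 2
```

Since _SelfSketch defines no methods of its own, every attribute access falls through to __getattr__, which records the method name and arguments for later replay.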


Notebook Functions

These are simple but handy, and let you know whether code is executing in a Jupyter Notebook, Colab, or an IPython shell:

from fastcore.imports import in_notebook, in_colab, in_ipython
in_notebook(), in_colab(), in_ipython()

This is useful if you are displaying certain types of visualizations, progress bars, or animations in your code that you may want to modify or toggle depending on the environment.
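One common heuristic for this kind of environment detection (not necessarily how fastcore implements it) is that IPython injects a get_ipython function into the builtins, so its presence signals an IPython-like environment:

```python
def in_ipython_sketch():
    "Toy environment check: IPython injects `get_ipython` into builtins."
    try:
        get_ipython  # noqa: F821 - only defined when running under IPython
        return True
    except NameError:
        return False

print(in_ipython_sketch())  # False in a plain Python interpreter
```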


A Drop-In Replacement For List

You might be quite happy with Python's list. This is one of those situations where you don't know you needed a better list until someone shows one to you. Enter L, a list-like object with many extra goodies.

The best way I can describe L is to pretend that list and numpy had a pretty baby:

Define a list (check out the nice __repr__ that shows the length of the list!):

L(1,2,3)
(#3) [1,2,3]

Shuffle a list:

from fastcore.foundation import L

p = L.range(20).shuffle()

(#20) [8,7,5,12,14,16,2,15,19,6...]

Index into a list:

L has sensible defaults, for example appending an element to a list:

There is much more L has to offer. Read the docs to learn more.
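To give a flavor of what's happening under the hood, here's a toy list subclass (my own sketch; the real L does far more) reproducing two of L's niceties: the length-prefixed __repr__ and indexing with a collection of indices:

```python
class LSketch(list):
    "Toy `L`: a list with a length-prefixed repr and collection indexing."
    def __repr__(self):
        shown = ','.join(repr(x) for x in self[:10])
        suffix = '...' if len(self) > 10 else ''
        return f'(#{len(self)}) [{shown}{suffix}]'
    def __getitem__(self, idx):
        if isinstance(idx, (list, tuple)):       # index with a collection of indices
            return LSketch(list.__getitem__(self, i) for i in idx)
        return list.__getitem__(self, idx)

p = LSketch(range(5))
print(p)             # (#5) [0,1,2,3,4]
print(p[[0, 2, 4]])  # (#3) [0,2,4]
```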


But Wait … There’s More!


There are more things I want to show you about fastcore, but there is no way they could reasonably fit into one blog post. Here is a list of some of my favorite things that I didn't demo in this blog post:



The Utilities section contains many shortcuts to perform common tasks, or provides an additional interface to what standard Python provides.



The Multiprocessing section extends Python's multiprocessing library by offering features like:

  • progress bars
  • the ability to pause, to mitigate race conditions with external services
  • processing things in batches on each worker, e.g. if you have a vectorized operation to perform in chunks


Functional Programming

The functional programming section is my favorite part of this library.

  • maps: a map that also composes functions
  • mapped: a more robust map
  • using_attr: compose a function that operates on an attribute



Transforms is a collection of utilities for creating data transformations and associated pipelines. These transformation utilities build upon many of the building blocks discussed in this blog post.


Further Reading

It should be noted that you should read the main page of the docs first, followed by the section on tests, to fully understand the documentation.


Shameless plug: fastpages

This blog post was written entirely in a Jupyter Notebook, which GitHub automatically converted into a blog post! Sound interesting? Check out fastpages.

Bio: Hamel Husain is a Staff Machine Learning Engineer @ GitHub.

Original. Reposted with permission.


