mPickle: Pickle for MicroPython
Open Access | Jan 2026

Figures & Tables

Figure 1. The pickle serialization and deserialization process.
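
For reference, the round trip in Figure 1 corresponds to the following minimal CPython sketch (the example object is illustrative, not taken from the paper):

import pickle

# Example object: a nested structure mixing built-in types.
record = {"sensor": "bme280", "readings": [21.4, 21.7, 22.1], "ok": True}

# Serialization: the object graph is encoded as a stream of pickle opcodes.
payload = pickle.dumps(record)

# Deserialization: the opcode stream is replayed to rebuild an equivalent object.
restored = pickle.loads(payload)
assert restored == record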

Table 1. Summary of main methods used by CPython’s pickle module.

METHOD | DESCRIPTION | ARGUMENTS | RETURN VALUE
__reduce__ | Provides instructions on how to reconstruct an object during pickling. Returns a tuple that describes how to recreate the object. Typically contains a callable (e.g., function), arguments for that callable, and optionally the object’s state. | None | Tuple of (callable, args, state, optionally other items)
__reduce_ex__ | Similar to __reduce__, but allows different pickling protocols. Takes a protocol version to decide how to serialize an object. Used to add compatibility for different pickling versions. | protocol (int) – specifies the pickle protocol version | Same as __reduce__: a tuple of (callable, args, state, optionally other items)
__new__ | Used to create a new instance of a class without initializing it. Pickling uses __new__ to create an instance from a serialized state. It is called during unpickling before initialization. | cls (type), followed by optional arguments to initialize an instance | New instance of the class
__setstate__ | Allows for setting the state of an object during unpickling. Used to restore the object’s internal state from pickled data, called after the object is created via __new__. | state (dict, tuple, or other serialized data) – the state of the object | None (modifies the instance in place)
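
To make the cooperation of these hooks concrete, the following toy class (illustrative, not from the paper) shows __reduce__ providing the reconstruction recipe and __setstate__ restoring the instance state during unpickling:

import pickle

class Counter:
    """Toy class whose pickling goes through the hooks listed in Table 1."""

    def __init__(self, start=0):
        self.value = start

    def __reduce__(self):
        # (callable, args, state): unpickling calls Counter(0) to create the
        # instance, then passes the state dict to __setstate__. The default
        # __reduce_ex__ path would instead create the instance via __new__.
        return (Counter, (0,), {"value": self.value})

    def __setstate__(self, state):
        # Restore the internal state captured above.
        self.value = state["value"]

c = Counter(41)
c.value += 1
clone = pickle.loads(pickle.dumps(c))
assert clone.value == 42
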
Figure 2. mPickle architecture and module mapping. Arrows highlight the direction of data movement: from MicroPython to Python (dashed line) and from Python to MicroPython (dotted line). Continuous lines highlight the interactions used to internally register function and module mappings.
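
mPickle’s internal registration API is not reproduced here; as an illustration of the same module-mapping idea on the CPython side, the standard pickle.Unpickler.find_class hook can redirect MicroPython-only module names during unpickling (the mapping entry below is illustrative):

import io
import pickle

# Illustrative mapping: redirect a MicroPython-side module name to its
# CPython counterpart while unpickling.
MODULE_MAP = {"ulab.numpy": "numpy"}

class MappingUnpickler(pickle.Unpickler):
    def find_class(self, module, name):
        # Rewrite the module path before the normal lookup happens.
        return super().find_class(MODULE_MAP.get(module, module), name)

def loads_with_mapping(payload: bytes):
    return MappingUnpickler(io.BytesIO(payload)).load()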

Figure 3. mPickle examples workflow. Data objects can be serialized (dumped) in MicroPython using mPickle and deserialized (loaded) in CPython using the native pickle module, and vice versa.
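
A minimal sketch of this round trip, assuming mPickle is importable as mpickle and mirrors the dumps/load interface of CPython’s pickle module (the file name and payload are illustrative):

# MicroPython side (sketch): serialize with mPickle, assuming a
# pickle-compatible dumps() is available.
import mpickle

payload = mpickle.dumps({"device": "esp32", "readings": [21.4, 21.7]})
with open("data.pkl", "wb") as f:
    f.write(payload)

# CPython side: once transferred, the same file loads with the standard library.
import pickle

with open("data.pkl", "rb") as f:
    restored = pickle.load(f)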

Figure 4. Comparison of serialized output sizes for JSON (dotted lines) and mPickle (continuous lines). Dots mark the break-even points. The zoom-in plots highlight the regions around the break-even points, where mPickle becomes more efficient than JSON. Panels (a) and (b) show the output sizes for lists and dictionaries, respectively. Lower values are better.

Table 2. Break-even points and space saving (percentage).

STRUCTURE TYPE | DATA TYPE | BREAK-EVEN STRUCTURE SIZE (ELEMENTS) | SAVING AT BREAK-EVEN (%) | MAX SAVING (%) | MAX SAVING STRUCTURE SIZE (ELEMENTS)
dict | float | 9 | 0.57 | 9.02 | 1000
list | float | 19 | 0.46 | 8.91 | 2000
dict | integer | 5 | 4.81 | 23.65 | 125
list | integer | 6 | 4.23 | 49.34 | 2000
dict | mixed | 6 | 1.52 | 15.38 | 125
list | mixed | 9 | 1.68 | 24.0 | 2000
dict | sequential | 9 | 1.85 | 23.12 | 250
list | sequential | 14 | 4.34 | 54.74 | 250
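
The break-even behaviour summarized in Table 2 and Figure 4 can be probed on the CPython side with a sketch like the one below, using the standard json and pickle modules as stand-ins; the resulting numbers depend on protocol, data, and platform, so they will not reproduce Table 2 exactly:

import json
import pickle
import random

def saving_percent(obj) -> float:
    """Space saving of pickle over JSON for one object (positive favours pickle)."""
    json_size = len(json.dumps(obj).encode("utf-8"))
    pickle_size = len(pickle.dumps(obj))
    return 100.0 * (json_size - pickle_size) / json_size

# Sweep list sizes to locate the break-even point for float payloads.
random.seed(0)
for n in (1, 5, 10, 20, 50, 100, 1000):
    data = [random.random() for _ in range(n)]
    print(f"{n:>5} floats: saving = {saving_percent(data):6.2f}%")
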
Figure 5. Space saving of mPickle vs. JSON, in percentage. Negative values mean JSON is more efficient, while positive values mean mPickle is more efficient. Panels (a) and (b) show the space saving for lists and dictionaries, respectively. The dashed line indicates the break-even boundary. Higher values are better.

Table 3. Main reuse scenarios enabled by mPickle across CPython (CP) and MicroPython (MP), mapped to the examples included.

REUSE SCENARIO | WHAT IS SERIALIZED | DIRECTION | WHERE SHOWN
Core interoperability | Simple & built-in data types, nested structures | MP ↔ CP | Ex. 0 and 1
Custom objects | User-defined classes and object state | MP ↔ CP | Ex. 2
Embedded systems | Configurations, structured runtime state, parameters | MP ↔ CP | Ex. 0, 1 and 2
Education and prototyping | Didactic examples, small projects, structured data exchange | MP ↔ CP | Ex. 0, 1 and 2
Numerical computing | NumPy arrays ↔ ulab arrays (via mapping) | MP ↔ CP | Ex. 3
Edge AI artifacts | Model weights/layer params as dict of arrays | CP → MP (typical) | Ex. 4
IoT integration | Sensor readings, batched records, device logs | MP/CP → server | Ex. 5
Data science/experimentation | Structured objects and arrays moved to embedded targets | CP → MP (common) | Ex. 3, 4, and 5
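
As one concrete instance of the “Edge AI artifacts” row (CP → MP), a host script might package model weights as a dict of NumPy arrays; the device would then load the file through mPickle with a NumPy-to-ulab mapping as in Figure 2. The layer names, shapes, file name, and protocol choice below are illustrative:

# CPython host (sketch): package model weights for a MicroPython target.
import pickle
import numpy as np

weights = {
    "dense1/kernel": np.random.randn(16, 8).astype(np.float32),
    "dense1/bias": np.zeros(8, dtype=np.float32),
}

with open("model_weights.pkl", "wb") as f:
    # Protocol choice is illustrative; check which protocols mPickle supports.
    pickle.dump(weights, f, protocol=2)
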
DOI: https://doi.org/10.5334/jors.587 | Journal eISSN: 2049-9647
Language: English
Submitted on: Jun 12, 2025 | Accepted on: Dec 30, 2025 | Published on: Jan 20, 2026
Published by: Ubiquity Press
In partnership with: Paradigm Publishing Services
Publication frequency: 1 issue per year

© 2026 Mattia Antonini, Massimo Vecchio, Fabio Antonelli, published by Ubiquity Press
This work is licensed under the Creative Commons Attribution 4.0 License.