AEStream sends event-based data from A to B. It is both a command-line tool and a C++/Python library with built-in GPU acceleration for use with PyTorch and Jax. We support reading and writing from files, event cameras, network protocols, and visualization tools.
Read more about the inner workings of the library in the AEStream publication.
## Installation

Read more in our installation guide.

The fastest way to install AEStream is by using pip:

```bash
pip install aestream
```
| Source | Installation | Description |
|---|---|---|
| pip | `pip install aestream` | Standard installation |
| | | Command-line interface |
Contributions to support AEStream on additional platforms are always welcome.
## Usage (Python): Load event files

Read more in our Python usage guide.

AEStream can process `.csv`, `.dat`, `.evt3`, and `.aedat4` files like so.
You can either directly load the file into memory:

```python
from aestream import FileInput

FileInput("file.aedat4", (640, 480)).load()
```

or stream the file in real time to PyTorch, Jax, or Numpy:

```python
from aestream import FileInput

with FileInput("file.aedat4", (640, 480)) as stream:
    while True:
        frame = stream.read("torch")  # Or "jax" or "numpy"
        ...
```
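As a library-free illustration of what such an event file contains, here is a minimal sketch that parses `.csv` events with the standard library. The four-column layout `timestamp, x, y, polarity` is an assumption for illustration, not a documented AEStream format.

```python
import csv
import io

# Hypothetical CSV event data; the column layout
# (timestamp in microseconds, x, y, polarity) is an assumption.
raw = io.StringIO("1000,10,20,1\n1500,11,20,0\n2000,10,21,1\n")

# One tuple per event
events = [(int(t), int(x), int(y), int(p)) for t, x, y, p in csv.reader(raw)]

print(len(events))  # 3
print(events[0])    # (1000, 10, 20, 1)
```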
## Usage (Python): Stream data from camera or network

Streaming data is particularly useful in real-time scenarios. We currently support Inivation, Prophesee, and SynSense devices over USB, as well as the SPIF protocol over UDP. Note: this requires a local installation of drivers and/or SDKs (see the installation guide).
```python
from aestream import USBInput

# Stream events from a DVS camera over USB
with USBInput((640, 480)) as stream:
    while True:
        frame = stream.read()  # A (640, 480) Numpy tensor
        ...
```

```python
from aestream import UDPInput

# Stream events from UDP port 3333 (default)
with UDPInput((640, 480), port=3333) as stream:
    while True:
        frame = stream.read("torch")  # A (640, 480) PyTorch tensor
        ...
```
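The frames returned by `stream.read()` are dense per-pixel event tensors, which downstream code typically integrates over time. As a sketch of one common pattern, here is an exponentially decaying accumulator in plain NumPy, with synthetic random frames standing in for camera output:

```python
import numpy as np

def decay_accumulate(frames, decay=0.9):
    """Blend a sequence of event frames into one decaying surface:
    old events fade geometrically while new events are added."""
    surface = np.zeros_like(frames[0], dtype=np.float32)
    for frame in frames:
        surface = decay * surface + frame
    return surface

# Synthetic stand-ins for frames produced by stream.read()
rng = np.random.default_rng(0)
frames = [(rng.random((640, 480)) > 0.99).astype(np.float32) for _ in range(5)]

surface = decay_accumulate(frames)
print(surface.shape)  # (640, 480)
```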
More examples can be found in our example folder. Please note that the examples may require additional dependencies (such as Norse for spiking networks or PySDL for rendering). To install all the requirements, simply run the following from the `aestream` root directory:

```bash
pip install -r example/requirements.txt
```
## Example: Real-time edge detection with spiking neural networks

We stream events from a camera connected via USB and process them on a GPU in real time using the spiking neural network library Norse, in fewer than 50 lines of Python. The left panel in the video shows the raw signal, while the middle and right panels show horizontal and vertical edge detection, respectively. The full example can be found in `example/usb_edgedetection.py`.
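For intuition about what the edge-detection panels compute, here is a library-free sketch of horizontal and vertical edge filters applied to an event frame using Sobel-style kernels in plain NumPy. The kernels, the naive correlation loop, and the synthetic frame are illustrative stand-ins, not the Norse-based code in `example/usb_edgedetection.py`:

```python
import numpy as np

def filter2d(frame, kernel):
    """Naive 'valid' 2D cross-correlation (no padding, no kernel flip)."""
    kh, kw = kernel.shape
    h, w = frame.shape
    out = np.zeros((h - kh + 1, w - kw + 1), dtype=np.float32)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(frame[i:i + kh, j:j + kw] * kernel)
    return out

# Sobel-style kernels for horizontal and vertical edges
horizontal = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], dtype=np.float32)
vertical = horizontal.T

# A synthetic event frame containing a single vertical stripe
frame = np.zeros((32, 32), dtype=np.float32)
frame[:, 16] = 1.0

h_edges = filter2d(frame, horizontal)
v_edges = filter2d(frame, vertical)
print(np.abs(h_edges).max(), np.abs(v_edges).max())  # 0.0 4.0
```

The vertical stripe only responds to the vertical-edge kernel, mirroring how the two panels in the video each highlight one edge orientation.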
## Usage (CLI)

Read more in our CLI usage documentation page.

Installing AEStream also gives access to the command-line interface (CLI) `aestream`. To use `aestream`, simply provide an `input` source and an optional `output` sink (defaulting to STDOUT):

```bash
aestream input <input source> [output <output sink>]
```
## Supported Inputs and Outputs

| Input | Description | Example usage |
|---|---|---|
| DAVIS, DVXPlorer | Inivation DVS camera over USB | |
| EVK Cameras | Prophesee DVS camera over USB | |
| File | Reads `.csv`, `.dat`, `.evt3`, and `.aedat4` files | |
| ZMQ | Stream events via ZMQ | |
| UDP network | Receives stream of events via the SPIF protocol | |
| Output | Description | Example usage |
|---|---|---|
| STDOUT | Standard output (default output) | |
| Ethernet over UDP | Outputs to a given IP and port using the SPIF protocol | |
| File | Output to | |
| File | Output to comma-separated-value (CSV) file format | |
| Viewer | View live event stream | |
## CLI examples

| Example | Syntax |
|---|---|
| View live stream of Inivation camera (requires Inivation drivers) | |
| Stream Prophesee camera over the network to 10.0.0.1 (requires Metavision SDK) | |
| Convert | |
## Acknowledgments

AEStream is developed by (in alphabetical order):

- Cameron Barker (@GitHub cameron-git)
- Juan Pablo Romero Bermudez (@GitHub jpromerob)
- Alexander Hadjivanov (@GitHub cantordust)
- Emil Jansson (@GitHub emijan-kth)
- Jens E. Pedersen (@GitHub jegp)
- Christian Pehle (@GitHub cpehle)
The work has received funding from the EC Horizon 2020 Framework Programme under Grant Agreements 785907 and 945539 (HBP) and from the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy EXC 2181/1 - 390900948 (the Heidelberg STRUCTURES Excellence Cluster).
Thanks to Philipp Mondorf for interfacing with Metavision SDK and preliminary network code.
## Citation

Please cite `aestream` if you use it in your work:

```bibtex
@inproceedings{10.1145/3584954.3584997,
  author = {Pedersen, Jens Egholm and Conradt, Jorg},
  title = {AEStream: Accelerated event-based processing with coroutines},
  year = {2023},
  isbn = {9781450399470},
  publisher = {Association for Computing Machinery},
  address = {New York, NY, USA},
  url = {https://doi.org/10.1145/3584954.3584997},
  doi = {10.1145/3584954.3584997},
  booktitle = {Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference},
  pages = {86--91},
  numpages = {6},
  keywords = {coroutines, event-based vision, graphical processing unit, neuromorphic computing},
  location = {San Antonio, TX, USA},
  series = {NICE '23}
}
```