Performance comparison of six Python implementations of LTTB, an algorithm for visual downsampling of timeseries data
LTTB, or Largest-Triangle-Three-Buckets, is a fantastic little algorithm that you can use for the visual downsampling of timeseries data.
Let’s say your user is viewing a line chart of some timeseries data. The time period they have selected contains 50,000 points, but their display is 4K, so they have at most 3840 pixels available horizontally. With LTTB, we can automatically select 3840 or fewer of those 50,000 points that produce a line chart visually very similar to the one they would see if all 50,000 points were rendered.
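To make the idea concrete, here is a minimal, unoptimized NumPy sketch of the algorithm (the function name `lttb` and the exact bucket-boundary choice are my own; the implementations compared below each handle these details differently):

```python
import numpy as np

def lttb(x, y, n_out):
    """Downsample (x, y) to n_out points using Largest-Triangle-Three-Buckets."""
    n = len(x)
    if n_out >= n or n_out < 3:
        return np.asarray(x), np.asarray(y)
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)

    # n_out - 1 boundaries => n_out - 2 buckets over the interior points;
    # the first and last points are always kept.
    bounds = np.linspace(1, n - 1, n_out - 1).astype(int)

    keep = np.empty(n_out, dtype=int)
    keep[0], keep[-1] = 0, n - 1

    a = 0  # index of the most recently selected point
    for i in range(n_out - 2):
        lo, hi = bounds[i], bounds[i + 1]

        # Third triangle vertex: the average of the *next* bucket
        # (or the final point, when we're in the last bucket).
        if i + 2 < len(bounds):
            cx = x[bounds[i + 1]:bounds[i + 2]].mean()
            cy = y[bounds[i + 1]:bounds[i + 2]].mean()
        else:
            cx, cy = x[-1], y[-1]

        # Pick the point in this bucket forming the largest triangle with
        # the previous selection and the next-bucket average.
        areas = np.abs(
            (x[a] - cx) * (y[lo:hi] - y[a]) - (x[a] - x[lo:hi]) * (cy - y[a])
        )
        a = lo + int(areas.argmax())
        keep[i + 1] = a

    return x[keep], y[keep]
```

Called as `xs, ys = lttb(t, v, 3840)`, this would reduce the 50,000-point series from the example above to a screen-width-friendly 3840 points in a single pass over the data.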