I'm using the new `to_dataframe()` function that was implemented in #380.
One issue I'm seeing is that loading some of the waveform signals from https://physionet.org/content/mimic3wdb-matched/1.0/ with `to_dataframe()` consumes a very large amount of memory. Specifically, on the machine I'm running on, which has 96 GB of memory, reading a record and calling `to_dataframe()` runs out of memory.
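Roughly, my code looks like the following (the record name and `pn_dir` path here are placeholders for illustration, not the exact record that failed):

```python
import wfdb

# Placeholder record from the MIMIC-III matched waveform database;
# the failure occurs on long multi-day records like these.
record = wfdb.rdrecord(
    "p000020-2183-04-28-17-47",
    pn_dir="mimic3wdb-matched/p00/p000020",
)

# This materializes the entire signal array as a single DataFrame,
# which exhausts memory for very long records.
df = record.to_dataframe()
```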
I would like to lazy-load the signal data into a chunked DataFrame, which would let me process the waveform signals in pieces that fit into memory rather than loading everything at once.
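As a sketch of the kind of chunked access I have in mind, something like the following can already be approximated with the existing `sampfrom`/`sampto` arguments to `rdrecord()` (the function name and chunk size are just illustrative):

```python
import wfdb

def iter_dataframe_chunks(record_name, pn_dir=None, chunk_size=10_000_000):
    """Yield the record's signals as DataFrames of at most chunk_size samples."""
    # Read only the header to learn the total signal length, without
    # pulling any signal data into memory.
    header = wfdb.rdheader(record_name, pn_dir=pn_dir)
    for start in range(0, header.sig_len, chunk_size):
        stop = min(start + chunk_size, header.sig_len)
        # Read just this slice of the record and convert it.
        chunk = wfdb.rdrecord(
            record_name, pn_dir=pn_dir, sampfrom=start, sampto=stop
        )
        yield chunk.to_dataframe()

# Example usage: process one bounded-size chunk at a time.
# for df in iter_dataframe_chunks("p000020-2183-04-28-17-47",
#                                 pn_dir="mimic3wdb-matched/p00/p000020"):
#     process(df)
```

Having something like this built into the library (or a `to_dataframe()` variant that yields chunks lazily) would avoid every caller reimplementing the slicing logic.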