Hello
So I'm wondering if there's a built-in/better way of calculating durations in a signal that alternates between 0 and 1. Basically, I want to get the time duration of each run of continuous 1s and then take the mean across all the runs found.
I don't know if I'm making sense, but something like: [1,1,1][0,0][1,1,1,1] -- there are two sequences of 1s, and I want to get the duration of each, sum them, and divide by 2 to get the mean.
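To make that concrete, here is a toy version of the kind of data I mean. The one-reading-per-second index and the column layout are just assumptions for illustration:

```python
import pandas as pd

# Toy data: one reading per second (an assumption), values alternate
# between runs of 1s and runs of 0s.
idx = pd.date_range("2021-01-01 00:00:00", periods=9, freq="s")
signal = pd.Series([1, 1, 1, 0, 0, 1, 1, 1, 1], index=idx)

# Here the two runs of 1s span 2 s and 3 s (last minus first timestamp),
# so the mean duration would be (2 + 3) / 2 = 2.5 s.
```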
I was able to create a solution by using/abusing iterrows to store the sequences in a list. Then I iterate through that list of series, calculate the total seconds of each by taking the difference between the last and first timestamps, store those durations in another list, and finally iterate through the duration list to calculate the mean.
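Roughly, my current approach looks like the sketch below (simplified; the column name `signal` and the float return are made up for the example):

```python
import pandas as pd

def mean_run_duration_slow(df: pd.DataFrame) -> float:
    """Collect each run of 1s with iterrows, then average the
    (last timestamp - first timestamp) duration of every run."""
    runs, current = [], []
    for ts, row in df.iterrows():  # slow: one Python-level loop iteration per row
        if row["signal"] == 1:
            current.append(ts)
        elif current:
            runs.append(current)
            current = []
    if current:  # close a run that reaches the end of the data
        runs.append(current)

    durations = [(run[-1] - run[0]).total_seconds() for run in runs]
    return sum(durations) / len(durations) if durations else 0.0
```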
My solution is quite slow though; it takes a few seconds to run on a time series of ~80k rows. Who knows how bad it'll get if I need to increase the number of rows in the future.
I figured something like this is probably a very common operation, so there must be a better solution out there. Any help with this problem is really appreciated. Thanks in advance.