Hi All,
I am working with a dataset in which I want to determine the reasons and predictors for interruptions in patients' stays at a long-term rehab facility. To find patients with a gap between the end date of one hospital stay and the begin date of the next, I need code that flags patients where the next begin date minus the previous end date is greater than zero. The challenge is that the two dates I want to subtract from each other sit in different rows, one row per hospital stay. The dataset has over 300k entries, so I would like an efficient way to find these individuals and then collapse the dataset to one row per patient, with the number of days they were out of the facility between their very first begin date and their very last end date. The table below illustrates the structure, and after it I have sketched the kind of logic I have in mind.
px_id   Facility Code   Begin Date   End Date
1       656             03/03/15     06/03/15
1       656             06/19/15     09/20/15
1       656             11/20/15     02/21/16
2       123             01/02/16     01/20/16
2       123             01/27/16     02/18/16
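To make the question concrete, here is a rough pandas sketch of the logic I have in mind. The column names (px_id, facility_code, begin_date, end_date) are my own guesses based on the table above, and the dates are read as MM/DD/YY. I'm sure there is a cleaner or faster way to do this, which is what I'm hoping to learn:

    import io
    import pandas as pd

    # Toy copy of the example table (assumed column names).
    raw = """px_id,facility_code,begin_date,end_date
    1,656,03/03/15,06/03/15
    1,656,06/19/15,09/20/15
    1,656,11/20/15,02/21/16
    2,123,01/02/16,01/20/16
    2,123,01/27/16,02/18/16"""

    df = pd.read_csv(io.StringIO(raw), skipinitialspace=True)
    for col in ("begin_date", "end_date"):
        df[col] = pd.to_datetime(df[col], format="%m/%d/%y")

    # Sort so that consecutive rows within each patient are consecutive stays.
    df = df.sort_values(["px_id", "begin_date"])

    # Gap = this stay's begin date minus the previous stay's end date, per patient.
    # shift() moves each patient's end dates down one row, so each patient's
    # first stay gets NaT and therefore no gap.
    prev_end = df.groupby("px_id")["end_date"].shift()
    df["gap_days"] = (df["begin_date"] - prev_end).dt.days

    # Collapse to one row per patient: first begin, last end, total days out.
    # sum() skips the NaN produced for each patient's first stay.
    result = df.groupby("px_id").agg(
        first_begin=("begin_date", "min"),
        last_end=("end_date", "max"),
        days_out=("gap_days", "sum"),
    ).reset_index()
    print(result)

On the sample data this gives 77 days out for patient 1 (16 + 61) and 7 days for patient 2, which is the one-row-per-patient summary I'm after.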
Thank you!