Hash : bc9888c9
Author :
Date : 2021-06-12T14:55:24
OS2_GetDisplayModes: malloc a new copy of mode's driver data.
Based on a patch by Jochen Schäfer <josch1710@live.de>:
The problem is that the initialization code uses the same structure for
desktop_mode and current_mode. See SDL_os2video.c:OS2_VideoInit():
stSDLDisplay.desktop_mode = stSDLDisplayMode;
stSDLDisplay.current_mode = stSDLDisplayMode;
...
stSDLDisplayMode.driverdata = pDisplayData;
Then, if you call GetDisplayModes, current_mode will be added to the modes
list with the same driverdata pointer as desktop_mode.
SDL_AddDisplayMode( display, &display->current_mode );
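For illustration (the assertion below is hypothetical and not part of the
original report), after this call the newly added list entry and desktop_mode
both alias the single allocation made in OS2_VideoInit():
/* Hypothetical check: both structures now share the same pDisplayData
   pointer, so freeing one leaves the other dangling. */
SDL_assert(display->display_modes[display->num_display_modes - 1].driverdata
           == display->desktop_mode.driverdata);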
When VideoQuit gets called, first the modes list gets freed, including the
driverdata, and then desktop_mode gets freed. See SDL_video.c:SDL_VideoQuit():
for (j = display->num_display_modes; j--;) {
    SDL_free(display->display_modes[j].driverdata);
    display->display_modes[j].driverdata = NULL;
}
SDL_free(display->display_modes);
display->display_modes = NULL;
SDL_free(display->desktop_mode.driverdata);
display->desktop_mode.driverdata = NULL;
So display_modes[j].driverdata gets freed, but desktop_mode.driverdata
points to the same memory and is not NULL'ed. When desktop_mode.driverdata
gets freed, the memory has already been freed, and libcx crashes the
application on SDL_Quit.
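A minimal sketch of the fix named in the commit title, assuming a driver-data
struct called MODEDATA (the real type name in SDL_os2video.c may differ):
OS2_GetDisplayModes allocates a fresh copy of the driver data before handing
the mode to SDL_AddDisplayMode, so the modes list owns its own allocation and
SDL_VideoQuit no longer frees the desktop mode's data twice:
static void OS2_GetDisplayModes(_THIS, SDL_VideoDisplay *display)
{
    SDL_DisplayMode mode = display->current_mode;
    MODEDATA *pModeData = (MODEDATA *) SDL_malloc(sizeof(MODEDATA));

    if (pModeData != NULL) {
        SDL_memcpy(pModeData, display->current_mode.driverdata, sizeof(MODEDATA));
        mode.driverdata = pModeData;  /* the list entry now owns its own copy */
    }
    SDL_AddDisplayMode(display, &mode);
}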