OS2_GetDisplayModes: malloc a new copy of mode's driver data.

Based on a patch by Jochen Schäfer <josch1710@live.de>:

The problem is that the initialization code uses the same structure for
desktop_mode and current_mode.  See SDL_os2video.c:OS2_VideoInit():

  stSDLDisplay.desktop_mode = stSDLDisplayMode;
  stSDLDisplay.current_mode = stSDLDisplayMode;
  ...
  stSDLDisplayMode.driverdata = pDisplayData;

Then, if you call GetDisplayModes, current_mode will be added to the modes
list with the same driverdata pointer as desktop_mode:

  SDL_AddDisplayMode( display, &display->current_mode );

When VideoQuit gets called, first the modes list gets freed, including the
driverdata, and then desktop_mode gets freed.  See SDL_video.c:SDL_VideoQuit():

  for (j = display->num_display_modes; j--;) {
      SDL_free(display->display_modes[j].driverdata);
      display->display_modes[j].driverdata = NULL;
  }
  SDL_free(display->display_modes);
  display->display_modes = NULL;
  SDL_free(display->desktop_mode.driverdata);
  display->desktop_mode.driverdata = NULL;

So display_modes[j].driverdata gets freed, but desktop_mode.driverdata still
points to the same memory and is not NULL'ed.  When desktop_mode.driverdata
then gets freed, that memory has already been released, and libcx crashes the
application on SDL_Quit.
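
For illustration, here is a minimal standalone sketch of the same ownership
pattern (not SDL code; MyModeData, MyDisplayMode and the variable names are
invented) showing why the aliased pointer double-frees and how a per-entry
deep copy, as in the diff below, avoids it:

  /* Minimal sketch (not SDL code): only the ownership pattern matches. */
  #include <stdlib.h>
  #include <string.h>

  typedef struct { int w, h; } MyModeData;
  typedef struct { MyModeData *driverdata; } MyDisplayMode;

  int main(void)
  {
      MyModeData *data = malloc(sizeof(*data));
      MyDisplayMode desktop_mode = { data };
      MyDisplayMode list_entry;

      if (!data) return 1;

      /* Buggy pattern: the list entry aliases desktop_mode's allocation,
       * so freeing both driverdata pointers frees the same memory twice. */
      list_entry = desktop_mode;

      /* Fixed pattern: give the list entry its own copy of the driver data
       * before it is stored. */
      list_entry.driverdata = malloc(sizeof(*list_entry.driverdata));
      if (list_entry.driverdata) {
          memcpy(list_entry.driverdata, desktop_mode.driverdata,
                 sizeof(MyModeData));
      }

      free(list_entry.driverdata);    /* frees the copy             */
      free(desktop_mode.driverdata);  /* frees the original: safe   */
      return 0;
  }
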
Author: Ozkan Sezer
Date:   2021-06-12 14:55:24 +03:00
Parent: d28437de3c
Commit: bc9888c9b5


@@ -1562,8 +1562,14 @@ static int OS2_GetDisplayDPI(_THIS, SDL_VideoDisplay *display, float *ddpi,
 static void OS2_GetDisplayModes(_THIS, SDL_VideoDisplay *display)
 {
+    SDL_DisplayMode mode;
+
     debug_os2("Enter");
-    SDL_AddDisplayMode(display, &display->current_mode);
+    SDL_memcpy(&mode, &display->current_mode, sizeof(SDL_DisplayMode));
+    mode.driverdata = (MODEDATA *) SDL_malloc(sizeof(MODEDATA));
+    if (!mode.driverdata) return; /* yikes.. */
+    SDL_memcpy(mode.driverdata, display->current_mode.driverdata, sizeof(MODEDATA));
+    SDL_AddDisplayMode(display, &mode);
 }
 static int OS2_SetDisplayMode(_THIS, SDL_VideoDisplay *display,