Ozkan Sezer
The attached patch removes the SDLCALL attribute from the SDL_BlitFunc() function pointer.
As far as I can see, *SDL_BlitFunc() is completely internal to SDL with
no specific calling convention requirements. The actual functions assigned
to SDL_BlitFunc do not seem to have any calling convention specified. So, the
easy solution is simply to remove the strict calling convention from the
type, as sketched below.
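A minimal before/after sketch of the idea (the exact typedef in SDL_blit.h may differ):

/* before: typedef void (SDLCALL *SDL_BlitFunc)(SDL_BlitInfo *info); */
typedef void (*SDL_BlitFunc)(SDL_BlitInfo *info);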
Sylvain
Here's a patch.
It tries to get the hint first. Resizable will allow any orientation; otherwise it uses the window width/height.
The setOrientation method is split into static and non-static parts, so that it can be overloaded in a user subclass.
One artifact observed: surfaceChanged() can be called twice at the beginning, when the phone starts in portrait and runs a landscape application.
Clayton Craft
The linux_input module is disabled by default, despite comments in the source code claiming otherwise:
src/video/directfb/SDL_DirectFB_video.c:
devdata->use_linux_input = readBoolEnv(DFBENV_USE_LINUX_INPUT, 0); /* default: on */
src/video/directfb/SDL_DirectFB_video.h:
#define DFBENV_USE_LINUX_INPUT "SDL_DIRECTFB_LINUX_INPUT" /* Default: on */
When using the directfb driver, the linux_input module is suppressed unless the SDL app is started with "SDL_DIRECTFB_LINUX_INPUT=1" set in the environment. I recall seeing at one point that the directfb folks recommended using linux_input over the other input drivers, but I am having trouble locating this recommendation. In any case, I believe this should really default to 'on', since it's vastly superior to the other dfb input drivers! A sketch of the change is below.
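A one-line sketch of the requested change, making the code match its own comment (readBoolEnv and DFBENV_USE_LINUX_INPUT are from the quoted source):

/* src/video/directfb/SDL_DirectFB_video.c */
devdata->use_linux_input = readBoolEnv(DFBENV_USE_LINUX_INPUT, 1); /* default: on */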
Alexey
This seems to be missing functionality. I want to set an icon from an RC file. I can't pass a MAKEINTRESOURCE(X) string to SDL_RegisterApp() because the value returned by MAKEINTRESOURCE is not actually a string, and SDL_strlen will crash on it; a sketch of the needed check is below. Moreover, LoadImage seems to load the wrong icon size; LoadIcon seems to be fine.
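For context, MAKEINTRESOURCE(X) packs the integer resource ID X into the low word of a pointer-sized value, so it is not a NUL-terminated string and dereferencing it reads from a near-NULL address. A minimal sketch of the check SDL_RegisterApp() would need, using the standard Win32 IS_INTRESOURCE macro (SafeNameLen is a hypothetical helper, not SDL API):

#include <windows.h>
#include <string.h>

/* Hypothetical helper: only measure genuine strings, never integer IDs. */
static size_t SafeNameLen(LPCSTR name)
{
    if (IS_INTRESOURCE(name)) {
        return 0;          /* integer resource ID, nothing to copy */
    }
    return strlen(name);   /* genuine resource name string */
}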
Edward Rudd
Device: Logitech Rumble Gamepad F510 in XInput mode.
Upon opening the joystick, the axis values queried via PollAllValues are not always actually set on the device.
This can easily be seen in the testjoystick or testgamecontroller test programs: testjoystick shows all axes centered until one 'tickles' the triggers, and testgamecontroller shows the triggers as 'on' until one 'tickles' them.
Upon further research, the culprit is the SDL_HINT_JOYSTICK_ALLOW_BACKGROUND_EVENTS hint. With the default value, events are ignored until there is an active window. Thus, in cases where the joystick system is initialized and controllers are opened before the initial window is created and focused, the initial values will be incorrect.
Here is my current workaround in the game I'm porting:
SDL_SetHint(SDL_HINT_JOYSTICK_ALLOW_BACKGROUND_EVENTS, "1");
SDL_GameController* gamepad = SDL_GameControllerOpen(index);
SDL_SetHint(SDL_HINT_JOYSTICK_ALLOW_BACKGROUND_EVENTS, "0");
Edmund Horner
When a 16-bit "565 format" surface has a colour key set, it will blit with correct transparency. If, however, it has its colour key set and is then converted to a 32-bit ARGB format surface, the colour key in the converted image will not necessarily have the same pixel value as the transparent pixels. It may not blit correctly, because the colour key does not match the intended pixels.
In my case, with an image using 0xB54A for transparency, the colour key was converted to (180,170,82), but the corresponding pixels (with the same original value) were converted to (180,169,82). Blitting the converted image did not use transparency where expected.
I have attached a test case. The bug has been replicated on both x86_64 Linux (SDL 2.0.2), and 32-bit MS C++ 2010 on Windows (SDL 2.0.0).
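For reference, a minimal repro sketch along the same lines, to run inside an initialized SDL2 program (surface size and setup are illustrative; the converted values in the comments are the ones reported above):

SDL_Surface *src = SDL_CreateRGBSurface(0, 32, 32, 16,
                                        0xF800, 0x07E0, 0x001F, 0x0000);
SDL_FillRect(src, NULL, 0xB54A);          /* the transparent value */
SDL_SetColorKey(src, SDL_TRUE, 0xB54A);
SDL_Surface *dst = SDL_ConvertSurfaceFormat(src, SDL_PIXELFORMAT_ARGB8888, 0);
/* dst's colour key becomes (180,170,82) while the pixels become
   (180,169,82), so blitting dst no longer treats them as transparent. */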
malferit
Hello, I began a little program with SDL2 on Linux in C, and when I call SDL_Init(SDL_INIT_VIDEO) I get an error and this is printed in the console:
XDM authorization key matches an existing client!
I searched the Internet and found that some people suggest running 'xhost +' or specifying this in /etc/X11/xdm/xdm-config:
DisplayManager*authName: MIT-MAGIC-COOKIE-1
I don't think an end user needs to know that...
But what bothered me is that I first wrote this little program in Pascal using the Free Pascal compiler, and it works. In Free Pascal you only use some thin header bindings and then link against the dynamic SDL library, so I didn't understand why it worked with Free Pascal and not in C.
I ran ldd on the two generated applications:
Application in C:
linux-gate.so.1 (0xffffe000)
libSDL2-2.0.so.0 => /usr/lib/libSDL2-2.0.so.0 (0xb76ac000)
libpthread.so.0 => /lib/libpthread.so.0 (0xb766e000)
libc.so.6 => /lib/libc.so.6 (0xb74e2000)
libm.so.6 => /lib/libm.so.6 (0xb74a0000)
libdl.so.2 => /lib/libdl.so.2 (0xb749a000)
librt.so.1 => /lib/librt.so.1 (0xb7491000)
/lib/ld-linux.so.2 (0xb77b3000)
Application compiled with Free Pascal:
linux-gate.so.1 (0xffffe000)
libSDL2-2.0.so.0 => /usr/lib/libSDL2-2.0.so.0 (0xb762a000)
libX11.so.6 => /usr/lib/libX11.so.6 (0xb74f3000)
libc.so.6 => /lib/libc.so.6 (0xb7367000)
libm.so.6 => /lib/libm.so.6 (0xb7325000)
libdl.so.2 => /lib/libdl.so.2 (0xb731f000)
libpthread.so.0 => /lib/libpthread.so.0 (0xb7305000)
librt.so.1 => /lib/librt.so.1 (0xb72fc000)
libxcb.so.1 => /usr/lib/libxcb.so.1 (0xb72dc000)
libXau.so.6 => /usr/lib/libXau.so.6 (0xb72d9000)
libXdmcp.so.6 => /usr/lib/libXdmcp.so.6 (0xb72d3000)
/lib/ld-linux.so.2 (0xb7755000)
It seems that Free Pascal links against libX11, libxcb, libXau and libXdmcp.
Linking my C application with libxcb solved the problem (linking with libXau and/or libXdmcp without libxcb didn't work). Linking with X11 pulls in all the other libraries and works as well.
So I'm filing this bug report mainly to let you know about this. I don't know whether this is a problem that can be solved on the libSDL side, but at least I hope it will help.
Hi, some tests:
1. Disabled XDM. Logged in on the console and ran 'startx'. The program works without having to link with X11.
2. Enabled XDM. Added 'DisplayManager*authName: MIT-MAGIC-COOKIE-1' to /etc/X11/xdm/xdm-config. The program works without having to link with X11.
3. Enabled XDM without 'DisplayManager*authName: MIT-MAGIC-COOKIE-1' in /etc/X11/xdm/xdm-config. I get the authentication error unless I link with X11.
Coriiander
There's a slight mistake in the function "GetWindowStyle" found in the file "SDL_windowswindow.c".
When a window is marked as resizable, the resizable style is added regardless of whether the window has a border or not. While for some arcane, hidden semantics this can be OK, it's still inconsistent in this case. A sketch of a consistent version is below.
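A hedged sketch of a consistent GetWindowStyle() branch, assuming the usual Win32 style bits (the real SDL code may differ in the exact flags used):

if (window->flags & SDL_WINDOW_RESIZABLE) {
    /* Only make the frame resizable when the window actually has a
       border; keep borderless windows borderless. */
    if (!(window->flags & SDL_WINDOW_BORDERLESS)) {
        style |= (WS_THICKFRAME | WS_MAXIMIZEBOX);
    }
}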
Witek Jachimczyk
I'm using SDL to develop a video viewer for MATLAB. The window is scrambled when using TightVNC in its default mode of RGB656.
SDL does not correctly recognize the pixel mode.
I found a solution for this problem. The solution involves modifying
SDL/src/video/SDL_pixels.c
Adding the following "if statement" under case 16: of SDL_MasksToPixelFormatEnum resolves the issue:
if (Rmask == 0x003F &&
    Gmask == 0x07C0 &&
    Bmask == 0xF800 &&
    Amask == 0x0000) {
    return SDL_PIXELFORMAT_RGB565;
}
I hope that this helps someone. It took me a while to figure it out.
Yann Dirson
When SDL_GL_GetProcAddress returns an error, the cause of the error is overwritten
in SDL_GL_GetAttribute, reporting to the user "Failed getting OpenGL glGetString entry point", whereas the original "OpenGL library not loaded" never makes it
to the user.
Pushed a fix to:
f94cb13708
Note that the "OpenGL library not loaded" error does not look like the root cause either,
and I'm still puzzled by the code path taken: I'm forcing the opengles2 renderer on
the x11 video driver on an RPi2, as in https://bugzilla.libsdl.org/3169, and although I now know that I must force the use of the RPI video driver instead
of the x11 one, I suspect even more accurate info can be given to the user.
Sylvain
Say you have an SDL_Surface that has a ColorKey but no Alpha Modulation.
When this surface is duplicated with the SDL_ConvertSurface function, the result has a ColorKey and Alpha Modulation (BLEND, and opaque 255).
I think SDL_ConvertSurface should strictly keep the input format.
example
=======
SDL_Surface *input; // ... Set up a surface with ColorKey and no AlphaMod
SDL_Surface *output = SDL_ConvertSurface(input, input->format, input->flags);
// "output" surface has a ColorKey but *also* AlphaMod (BLEND, and Opaque 255).
owen
I removed all the static variables from SDLActivity.java and updated all the SDL_android.c JNI calls as well.
I added a new function to SDL_android.c/.h:
void Android_JNI_SeparateEventsHint(const char* c);
This is called by SDL_androidtouch.c so that this translation unit doesn't need to call any JNI functions.
Philipp Wiesemann
There is another problem with the current implementation which should maybe be fixed first (to prevent some wasted work). It was written as if it would get the number of a single pressed button from the Java side, but it actually gets the state of all buttons. That is why it will not work correctly if more than one button is pressed at once.
romain.lacroix
For the Windows implementation of SDL_ShowMessageBox() (./src/video/windows/SDL_windowsmessagebox.c:345, WIN_ShowMessageBox()):
The implementation in 2.0.4 uses the button index for the "id" parameter of AddDialogButton().
It then expects the value provided in the wParam of MessageBoxDialogProc() to be a valid button index,
and uses this value to index into the array of buttons when DialogBoxIndirect() returns (line 474: *buttonid = buttons[which].buttonid;).
However, when dismissing the box with Escape, the return value of DialogBoxIndirect will be SDL_MESSAGEBOX_BUTTON_ESCAPEKEY_DEFAULT (=2), which is not always a valid index into the buttons array.
When the buttons array has a length less than or equal to 2, the memory access is invalid; I can see that the value written to *buttonid is uninitialized memory (a random value).
The fix I propose: use the "buttonid" field of each button for the "id" parameter of AddDialogButton(), then copy the return value of DialogBoxIndirect() into *buttonid. This way we never use an out-of-bounds index into the buttons array.
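As an aside, a purely defensive variant (distinct from the fix proposed above) would be to bounds-check the dialog's return value before indexing; a sketch, where numbuttons is the number of entries in the buttons array:

/* which: the value DialogBoxIndirect() returned (may be 2 with 1 button) */
if (which >= 0 && which < numbuttons) {
    *buttonid = buttons[which].buttonid;
} else {
    *buttonid = -1;   /* dismissed without pressing a listed button */
}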
Simon Hug
The RGBA_FROM_PIXEL macro in src/video/SDL_blit.h [1] is not designed to work with more than 8 bits per channel, and the ARGB2101010 format makes it read outside the array bounds, causing access violations. This can happen during blitting with the BlitNtoNPixelAlpha and SDL_Blit_Slow functions.
When SDL_InitFormat tries to calculate the loss for each channel [2], the Uint8 wraps around and ends up at 254 for the 10-bit channels, clearly way past the 9 entries of the SDL_expand_byte array. (Not that a signed integer would help.) Then the macro tries to index the lookup table with the channel value, which can be up to 1023. If the previous indirection didn't cause an access violation, this one will. The arithmetic is sketched below.
I guess it's not worth modifying this macro for a format that only a few will use; it would only make the other blitters slower. I don't have good ideas to solve this issue.
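A standalone illustration of that Uint8 wraparound (plain C; the names are illustrative, the arithmetic is the one described above):

#include <stdio.h>

typedef unsigned char Uint8;

int main(void)
{
    Uint8 bits = 10;                 /* channel width in ARGB2101010 */
    Uint8 loss = (Uint8)(8 - bits);  /* 8 - 10 = -2 wraps to 254     */
    printf("loss = %d\n", loss);     /* prints 254: far past the 9
                                        entries of SDL_expand_byte   */
    return 0;
}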
Attached is a test case that does three blits: a copy blit that works, and two that use the functions mentioned above.
[1] https://hg.libsdl.org/SDL/file/cd1994d4f3c6/src/video/SDL_blit.h#l303
[2] https://hg.libsdl.org/SDL/file/cd1994d4f3c6/src/video/SDL_pixels.c#l540
Simon Hug
Some code in SDL loads libraries with SDL_LoadObject to get more information or to use newer APIs. SDL_LoadObject may fail and set an error message, and SDL will continue with some fallback code. Since SDL will overwrite the error or exit the function with a return value that indicates success, the error from SDL_LoadObject for the optional stuff might as well be cleared right away.
Eric Wasylishen
I set up an (admittedly exotic) 3-monitor setup, and when I enter fullscreen-desktop on the middle display (#2), the SDL window is off center (it covers half of monitor #2 and most of monitor #3).
The displays are arranged from left to right:
Display #1 (main): 2880x1800, 200% scaling
Display #2: 1920x1200, 150% scaling
Display #3: 1920x1080, 100% scaling
SDL display bounds:
INFO: Bounds: 1440x900 at 0,0
INFO: Bounds: 1281x801 at 1921,0 (these are incorrect)
INFO: Bounds: 1920x1080 at 4800,0
Correct bounds reported by calling EnumDisplayMonitors and printing the LPRECT param of the callback:
1440x900 at (0, 0)
1280x800 at (2880, 0)
1920x1080 at (4800, 0)
It seems like you need 3 displays to reproduce this, and the left two need DPI scaling, and the 3rd display needs to have a different scale factor than the others.
Related: https://bugzilla.libsdl.org/show_bug.cgi?id=3709
SDL: current hg (11235:6a587b9e0ec8)
Windows 10, Version 10.0.15063 Build 15063
Tested with testdraw2 and testgl2, and pressing alt+enter to enter fullscreen desktop.
This patch reworks SDL_windowsmodes.c to use EnumDisplayMonitors instead of EnumDisplayDevices, so we always have an HMONITOR for each SDL display.
With access to an HMONITOR, we can get the monitor bounds in virtual-screen coordinates the proper way, by calling GetMonitorInfo (whereas the original code did calculations such as "data->DeviceMode.dmPosition.x * data->ScaleX" to try to derive virtual-screen coordinates; these worked in simple cases but failed in more complex ones like this bug).
The one potential problem with my patch is that the ChangeDisplaySettingsEx docs say you're supposed to get the display name from EnumDisplayDevices, but I'm now getting the display name from GetMonitorInfo.
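For reference, a standalone sketch of the GetMonitorInfo approach (not the patch itself): enumerate HMONITORs and read virtual-screen bounds directly, instead of scaling dmPosition by hand.

#include <windows.h>
#include <stdio.h>

static BOOL CALLBACK PrintMonitor(HMONITOR hMonitor, HDC hdc,
                                  LPRECT rect, LPARAM lp)
{
    MONITORINFOEXA info;
    info.cbSize = sizeof(info);
    if (GetMonitorInfoA(hMonitor, (LPMONITORINFO)&info)) {
        printf("%s: %ldx%ld at (%ld, %ld)\n", info.szDevice,
               info.rcMonitor.right - info.rcMonitor.left,
               info.rcMonitor.bottom - info.rcMonitor.top,
               info.rcMonitor.left, info.rcMonitor.top);
    }
    return TRUE; /* keep enumerating */
}

int main(void)
{
    EnumDisplayMonitors(NULL, NULL, PrintMonitor, 0);
    return 0;
}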
Simon Hug
KMSDRM_VideoInit allocates and frees some connectors and encoders but doesn't set the pointers to NULL after freeing. The cleanup code at the end may free one of those garbage pointers should an error happen during initialization.
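The fix is the usual drop-the-pointer-after-free pattern; a generic standalone illustration (plain C, not the KMSDRM code itself):

#include <stdlib.h>

int main(void)
{
    char *encoder = malloc(16);   /* stands in for the drm connector/encoder */
    free(encoder);
    encoder = NULL;               /* reset so later cleanup can't double-free */
    free(encoder);                /* free(NULL) is a well-defined no-op */
    return 0;
}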
Simon Hug
When WIN_WindowProc processes the WM_TOUCH message, it doesn't check if the touch functions have been properly loaded and may call a NULL pointer. It's probably an extremely rare case, but here's a patch that adds some checks anyway.
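The pattern being added is a guard on dynamically loaded entry points; a generic standalone illustration (the names are hypothetical, not SDL's actual fields):

#include <stdio.h>

typedef int (*TouchFn)(void);     /* stands in for the WM_TOUCH helpers */

static void HandleTouchMessage(TouchFn GetTouchInputInfo)
{
    if (GetTouchInputInfo == NULL) {
        return;                   /* loading failed at init: drop the message */
    }
    GetTouchInputInfo();          /* safe to call */
}

int main(void)
{
    HandleTouchMessage(NULL);     /* no crash: the guard catches it */
    printf("ok\n");
    return 0;
}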
Levi Bard
In some environments, xrandr modes initialization can fail even though xrandr support is present and of a sufficient version.
(The one I encountered was an AWS instance running a virtual display)
The attached patch allows SDL to keep trying other methods if xrandr modes initialization fails (still subject to SDL_VIDEO_X11_REQUIRE_XRANDR).
Manuel
The attached patch adds support for KMS/DRM context graphics.
It builds with no problem on X86_64 GNU/Linux systems, provided the needed libraries are present, and on ARM GNU/Linux systems that have KMS/DRM support and a GLES2 implementation.
Tested on Raspberry Pi: KMS/DRM is what the Raspberry Pi will use by default in the near future, once the proprietary DispmanX API by Broadcom is superseded by the open graphics stack. It's possible to boot the current Raspbian system in KMS mode by adding "dtoverlay=vc4-kms-v3d" to config.txt on Raspbian's boot partition.
X86 systems use KMS right away on every current GNU/Linux system.
Simple build instructions:
$./autogen.sh
$./configure --enable-video-kmsdrm
$make
Now the clipboard isn't lost if you destroy a specific SDL_Window, as it
works on other platforms. You will still lose the clipboard data on
SDL_Quit() or process termination, but that's X11 for you; run a
Clipboard Manager daemon.
Fixes Bugzilla #3222.
Fixes Bugzilla #3718.
Eric Wasylishen
I think I found a better fix.
The problem with https://hg.libsdl.org/SDL/rev/ebdc0738b1b5 is that setting the styleMask to 0 clears the NSWindowStyleMaskFullScreen bit, which then confuses Cocoa later when you try to leave fullscreen. Instead I'm just clearing the NSWindowStyleMaskResizable bit, although SetWindowStyle(window, NSWindowStyleMaskFullScreen); seems to also work.
Eric Wasylishen
Unfortunately this commit seems to have broken exiting desktop-fullscreen.
- Launch testgl2.
- Press alt+enter to go fullscreen-desktop
- Press alt+enter again. The spinning cube will freeze, and the window stays fullscreen desktop.
Simon Hug
SDL_GL_GetAttribute doesn't check if a video driver has been initialized and will access the SDL_VideoDevice pointer, which is NULL at that point.
I think all of the attributes require an initialized driver, so a simple NULL check should fix it. Patch is attached.
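A minimal sketch of such a guard, assuming it can reuse the internal SDL_UninitializedVideo() helper that other functions in SDL_video.c use to set the standard error (the rest of the body is elided):

int SDL_GL_GetAttribute(SDL_GLattr attr, int *value)
{
    if (!_this) {                        /* no video driver initialized */
        return SDL_UninitializedVideo(); /* set the standard error, return -1 */
    }
    /* ... existing attribute handling ... */
}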