
I have a Java app rendering some sprites using LWJGL and OpenGL. It works fine until I move it to a remote virtual machine with no physical graphics card; Mesa 3D and related packages are used to emulate one there. When I connect via SSH and start the job manually it works fine, but running as a cron job it throws an exception:

Caused by: java.lang.RuntimeException: org.lwjgl.LWJGLException: Could not open X display connection
        at org.lwjgl.opengl.Display.<clinit>(Display.java:141) ~[lwjgl.jar:na]
        ... 7 common frames omitted
Caused by: org.lwjgl.LWJGLException: Could not open X display connection
        at org.lwjgl.opengl.LinuxDisplay.openDisplay(Native Method) ~[lwjgl.jar:na]

Obviously something is wrong with X11.

Another update:

I found out it fails at the check for whether either Xrandr or XF86VidMode is supported. I have them installed, but they appear to be disabled. I tried to explicitly add RANDR, but that didn't help:

xvfb-run -a '--server-args=+extension RANDR -screen 0 1024x768x16' /home/username/start.sh
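One way to see whether the Xvfb server actually advertises the RANDR extension is to query it with xdpyinfo. A sketch, assuming Xvfb and xdpyinfo are installed (display number :99 is arbitrary):

```shell
#!/bin/sh
# Start a throwaway Xvfb on display :99 with RANDR requested,
# then list the extensions the server actually advertises.
Xvfb :99 -screen 0 1024x768x24 +extension RANDR &
XVFB_PID=$!
sleep 1
DISPLAY=:99 xdpyinfo | grep -i randr
kill "$XVFB_PID"
```

If `grep` prints nothing, the server is not exporting RANDR regardless of what was requested on the command line.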

2 Answers


X11 servers don't simply get started on demand. There has to be a running X11 server; your application has to know which X11 server to use via the DISPLAY variable; and the account must be authorized (via xauth) to connect to the server that DISPLAY points to.

X11 forwarding built into SSH is convenient because it handles all of these details automatically, without you having to know anything about them. But that won't help you from cron.

Zoredache

Your problem is that the program expects a feature which may be present on a "real" X server, but which is absent from Xvfb.

Adding features to Xvfb is probably more work than you are willing to put into this. So you'll be looking for alternatives, of which there are some:

  • Get the application to stop depending on the display mode extension.
  • Run the application on a "real" X server instead.
  • Run the application on another X server.

If it is your own application, you should be able to figure out at what point it tries to use the display mode extension. Might the application be unhappy with the settings you chose for Xvfb? If you are lucky, it may be as simple as the application requiring 32-bit color depth, in which case specifying x32 on the command line rather than x16 would help.
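As a concrete sketch of that last suggestion, the earlier xvfb-run invocation can be retried with a deeper framebuffer. The example below uses 24-bit depth, since (as an assumption) not every Xvfb build accepts 32; the paths and geometry are carried over from the question:

```shell
# Same invocation as in the question, but with a 24-bit screen
# instead of 16-bit.
xvfb-run -a --server-args='-screen 0 1024x768x24 +extension RANDR' /home/username/start.sh
```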

A real X server might still be an option, even on a virtual machine. It depends on the capabilities of the graphics emulation on that virtual machine.

Alternatively, it may be worth trying Xvnc, which I believe has more features than Xvfb.
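A sketch of the Xvnc route, using the common vncserver wrapper script (this assumes TigerVNC or a similar package is installed and already configured; the display number and geometry are arbitrary):

```shell
#!/bin/sh
# Start an Xvnc-backed display, point the application at it,
# and tear the display down afterwards.
vncserver :1 -geometry 1024x768 -depth 24
DISPLAY=:1 /home/username/start.sh
vncserver -kill :1
```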

kasperd