[FXRuby] FXGLViewer and textures

Windows 2000
Ruby 1.7.3-7
FXRuby 1.0.20

I want to use textures in an FXGLViewer, but there seems to be some
initialization for the OpenGL system missing. When calling
GL.GenTextures, a huge number is returned. The OpenGL samples from the
samples/opengl folder always return 1 or 2. The difference is that
these examples use GLUT for the GUI – not FXRuby.

I have attached an example for my problem. The program prints

rubyw gltexture.rbw
Generated texture number 44171928
Generate mipmap
gltexture.rbw:40: [BUG] Segmentation fault
Exit code: 3

when executed. The segmentation fault seems to be a consequence of the
invalid texture number.

Does anybody know what initialization I’m missing?

Rasmus

# Use textures with FXGLViewer

# Load modules
require 'fox'
require 'opengl'
require 'fox/glshapes'

include Fox

# Object for the FXGLViewer

class TestGLIcon < FXGLObject
  def initialize(texturedata, width, height, points)
    # Get a new texture name ...
    @texture = GL.GenTextures(1).first

    # ... and print it.
    # The examples from the samples/opengl folder
    # return 1 or 2 here. Seems that this call
    # fails already.
    $stdout.print "Generated texture number ", @texture, "\n"
    $stdout.flush

    # Setup for texturing
    # The texture data is a buffer of gray values
    GL.ClearColor(0.0, 0.0, 0.0, 0.0)
    GL.ShadeModel(GL::FLAT)
    GL.Enable(GL::DEPTH_TEST)
    GL.PixelStorei(GL::UNPACK_ALIGNMENT, 1)

    # Generate texture data
    GL.BindTexture(GL::TEXTURE_2D, @texture)
    GL.TexParameteri(GL::TEXTURE_2D, GL::TEXTURE_WRAP_S, GL::CLAMP)
    GL.TexParameteri(GL::TEXTURE_2D, GL::TEXTURE_WRAP_T, GL::CLAMP)
    GL.TexParameteri(GL::TEXTURE_2D, GL::TEXTURE_MAG_FILTER, GL::LINEAR)
    GL.TexParameteri(GL::TEXTURE_2D, GL::TEXTURE_MIN_FILTER, GL::LINEAR_MIPMAP_LINEAR)

    $stdout.print "Generate mipmap\n"
    $stdout.flush

    GLU.Build2DMipmaps(
        GL::TEXTURE_2D, 4,
        width, height,
        GL::LUMINANCE, GL::UNSIGNED_BYTE,
        texturedata)

    # Map the corners to this quad
    @corners = points.map { |f| FXGLPoint.new(f[0], f[1], f[2]) }
  end

  def bounds
    # Calculate the bounding range
    FXRange.new(
        @corners.map { |c| c.pos[0] }.min,
        @corners.map { |c| c.pos[0] }.max,
        @corners.map { |c| c.pos[1] }.min,
        @corners.map { |c| c.pos[1] }.max,
        @corners.map { |c| c.pos[2] }.min,
        @corners.map { |c| c.pos[2] }.max)
  end

  def draw(viewer)
    # Setup for texture drawing
    GL.Clear(GL::COLOR_BUFFER_BIT | GL::DEPTH_BUFFER_BIT)
    GL.BindTexture(GL::TEXTURE_2D, @texture)

    # Draw the textured quad
    GL.Begin(GL::QUADS)
    GL.TexCoord(0.0, 0.0); GL.Vertex(@corners[0])
    GL.TexCoord(0.0, 1.0); GL.Vertex(@corners[1])
    GL.TexCoord(1.0, 1.0); GL.Vertex(@corners[2])
    GL.TexCoord(1.0, 0.0); GL.Vertex(@corners[3])
    GL.End()
  end

  def hit(viewer)
    GL.Begin(GL::QUADS)
    GL.Vertex(@corners[0])
    GL.Vertex(@corners[1])
    GL.Vertex(@corners[2])
    GL.Vertex(@corners[3])
    GL.End()
  end
end

class TestViewer
  def initialize
  end

  def scene(gentexture = 1)
    newscene = FXGLGroup.new

    shapes = FXGLGroup.new
    newscene.append(shapes)

    shapes.append(FXGLCube.new(0.5, 0.5, 0.5, 0.2, 0.2, 0.2))

    if gentexture == 1
      textures = FXGLGroup.new
      newscene.append(textures)

      width  = 64
      height = 64
      image = []
      for i in 0...height
        for j in 0...width
          c = if (i & 0x8 == 0) != (j & 0x8 == 0) then 255 else 0 end
          image[i*width + j] = c
        end
      end

      textures.append(TestGLIcon.new(
          image.pack("C*"), width, height,
          [[0, 0, 0], [0, 1, 0], [1, 1, 0], [1, 0, 0]]))
    end

    newscene
  end
end

class TestWindow < FXMainWindow
  def initialize(app)
    # Invoke the base class initializer
    super(app, "Test", nil, nil, DECOR_ALL, 0, 0, 800, 600)

    # Make a tool tip
    FXTooltip.new(getApp(), 0)

    # GL visual
    @glvisual = FXGLVisual.new(getApp(), VISUAL_DOUBLEBUFFER)

    # Drawing GL canvas
    @viewer = FXGLViewer.new(
        self,
        @glvisual,
        nil,
        0,
        LAYOUT_FILL_X|LAYOUT_FILL_Y|LAYOUT_TOP|LAYOUT_LEFT)
  end

  def scene(aScene)
    @viewer.scene = aScene
  end

  def create
    super
    show(PLACEMENT_SCREEN)
  end
end

if __FILE__ == $0
  application = FXApp.new("Test", "Test")
  window = TestWindow.new(application)
  application.create
  viewer = TestViewer.new
  window.scene viewer.scene(1)
  application.run
end

Rasmus wrote:

I want to use textures in an FXGLViewer, but there seems to be some
initialization for the OpenGL system missing. When calling
GL.GenTextures, a huge number is returned. The OpenGL samples from the
samples/opengl folder always return 1 or 2. The difference is that
these examples use GLUT for the GUI – not FXRuby.

I have attached an example for my problem. The program prints

rubyw gltexture.rbw
Generated texture number 44171928
Generate mipmap
gltexture.rbw:40: [BUG] Segmentation fault
Exit code: 3

when executed. The segmentation fault seems to be a consequence of the
invalid texture number.

Does anybody know what initialization I’m missing?

Yes, you need to make the GL viewer’s context the “current” context
before calling stuff like GL.GenTextures. The least intrusive changes to
your example program would be to first add an accessor to the TestWindow
class to return a reference to the FXGLViewer window, i.e.

 class TestWindow < FXMainWindow

     attr_reader :viewer

     # ... remaining code as before ...
 end

and then add calls to FXGLViewer#makeCurrent and
FXGLViewer#makeNonCurrent before and after the call to TestViewer#scene,
i.e.

 if __FILE__ == $0
   application = FXApp.new("Test", "Test")
   window = TestWindow.new(application)
   application.create
   viewer = TestViewer.new

   # Make the GL viewer's context current
   window.viewer.makeCurrent

   # Generate textures, etc.
   window.scene viewer.scene(1)

   # Make the GL viewer's context non-current
   window.viewer.makeNonCurrent

   application.run
 end

I see some other problems, e.g. you’re trying to pass FXGLPoint objects
in as arguments to GL.Vertex, which expects an Array. But I’ll let you
debug those yourself first ;-) If you run into additional problems, let
me know.
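For what it’s worth, a minimal sketch of that conversion. It uses a plain
Struct as a stand-in for FXGLPoint (whose coordinates live in #pos, as
your bounds method already assumes), so the unwrapping can be shown
without a live GL context; the commented-out GL calls indicate where it
would slot into draw:

```ruby
# Stand-in for Fox::FXGLPoint; the real class keeps its coordinates in #pos.
Point = Struct.new(:pos)

corners = [[0, 0, 0], [0, 1, 0], [1, 1, 0], [1, 0, 0]].map do |xyz|
  Point.new(xyz)
end

# GL.Vertex expects a plain Array of coordinates, not a point object,
# so unwrap #pos before handing the corners to OpenGL:
vertices = corners.map { |c| c.pos }

# In TestGLIcon#draw this would become something like:
#   GL.Begin(GL::QUADS)
#   vertices.each { |v| GL.Vertex(v) }
#   GL.End()
p vertices  # => [[0, 0, 0], [0, 1, 0], [1, 1, 0], [1, 0, 0]]
```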

Hope this helps,

Lyle