Author Topic: Black Alpha Channel (OpenGL)  (Read 1590 times)

Black Alpha Channel (OpenGL)
« on: March 03, 2009, 03:14:35 pm »
I've been playing around with the PNG format using FreeImage lately. I've managed to load PNG images and get them onto a quad as a texture, but the alpha channel shows up as black. I've enabled GL_BLEND and searched around quite a bit on Google, but turned up nothing that helps. Here's the code I'm using:

   
Code: [Select]
    void HVSTGFX::initGL()
    {
        glEnable(GL_DEPTH_TEST);
        glEnable(GL_CULL_FACE);
        glFrontFace(GL_CCW);
        glEnable(GL_TEXTURE_2D);
        glEnable(GL_BLEND);
    }

This is called on WM_CREATE. As you can see, I have GL_BLEND in there.

   
Code: [Select]
    unsigned char * HVSTGFX::loadPNGFile(char *fileName, PNGFILE *pngfile)
    {
        FREE_IMAGE_FORMAT fif = FIF_PNG;

        // pointer to the image, once loaded
        FIBITMAP *dib(0);
        // pointer to the image data
        unsigned char* bits(0);
        unsigned char tempRGB;

        if (FreeImage_FIFSupportsReading(fif))
            dib = FreeImage_Load(fif, fileName);

        if (!dib)
            return NULL;

        bits = FreeImage_GetBits(dib);
        pngfile->width = FreeImage_GetWidth(dib);
        pngfile->height = FreeImage_GetHeight(dib);
        int size = pngfile->width * pngfile->height;

        // FreeImage stores pixels as BGRA on little-endian systems, so swap
        // the blue and red bytes of each 4-byte pixel to get RGBA. The buffer
        // holds 4 bytes per pixel; the old bound size*(pngfile->width/16)
        // only equaled size*4 for 64-pixel-wide images.
        for (int imageIDx = 0; imageIDx < size * 4; imageIDx += 4)
        {
            tempRGB = bits[imageIDx];
            bits[imageIDx] = bits[imageIDx + 2];
            bits[imageIDx + 2] = tempRGB;
        }
        //FreeImage_Unload(dib);
        return bits;
    }


This is my PNG loader.

   
Code: [Select]
    // Note: sprite is taken by reference; passed by value, the
    // sprite.loaded = true below would never persist between calls.
    void HVSTGFX::createSpritePNG(float width, float height, float x, float y, CSprite &sprite)
    {
        if (!sprite.loaded)
        {
            //glGenTextures(texCount++, &sprite.texture);
            //glBindTexture(GL_TEXTURE_2D, sprite.texture);
            sprite.loaded = true;
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        }
        else
            glColor4f(1.0f, 1.0f, 1.0f, .0f);

        glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f); glVertex3f(x, y, 0.0f);
        glTexCoord2f(1.0f, 0.0f); glVertex3f(x + width, y, 0.0f);
        glTexCoord2f(1.0f, 1.0f); glVertex3f(x + width, y + height, 0.0f);
        glTexCoord2f(0.0f, 1.0f); glVertex3f(x, y + height, 0.0f);
        glEnd();
    }

This actually draws the given image.

   
Code: [Select]
    glBlendFunc(GL_SRC_ALPHA,GL_ONE_MINUS_SRC_ALPHA);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, pngTest.pngFile.width, pngTest.pngFile.height, 0, GL_RGBA, GL_UNSIGNED_BYTE, pngTest.sprite);
    HVSTGFX::createSpritePNG(.4f, .54f, -.5f, -.5f, pngTest);

And this is in my main loop. As I said, the image gets textured and draws without a problem, but the alpha channel is just black. Anyone able to help?
Re: Black Alpha Channel (OpenGL)
« Reply #1 on: March 04, 2009, 12:50:23 am »
I think there are a few things that could be going wrong...
  • FreeImage_GetBits does not contain alpha information.
  • FreeImage_GetBits does not contain alpha information in the format that OpenGL expects.*
  • The image was not saved with an actual alpha channel.

Unfortunately I cannot really suggest a solution to your problem because I am not an expert with the FreeImage library. I would recommend researching the format of the bits in the buffer pointed to by FreeImage_GetBits and using the information below to figure out if it is a problem with your program, the FreeImage library, or the image itself.

*Using GL_RGBA and GL_UNSIGNED_BYTE for glTexImage2D means that every pixel in the buffer you pass to glTexImage2D occupies four unsigned bytes, one byte each for red, green, blue, and alpha, in that order.
Re: Black Alpha Channel (OpenGL)
« Reply #2 on: March 04, 2009, 03:47:51 am »
Hrmm, do you have alpha testing on?

Change this in your main loop code to test:

Code: [Select]
OpenGL.glEnable(GLOption.Blend);
OpenGL.glEnable(GLOption.AlphaTest);
OpenGL.glAlphaFunc(GLAlphaFunc.Greater, 0.0f);
OpenGL.glBlendFunc(GLBlendSrc.SrcAlpha, GLBlendDest.OneMinusSrcAlpha);

Excuse my .net code, but you'll get the idea.
« Last Edit: March 04, 2009, 03:50:16 am by Xfixium »
Re: Black Alpha Channel (OpenGL)
« Reply #3 on: March 04, 2009, 04:10:04 am »
Wow, that worked perfectly.  Thanks a bunch =D
Re: Black Alpha Channel (OpenGL)
« Reply #4 on: March 04, 2009, 01:40:14 pm »
Well, that works too... Nice catch, Xfixium!