Shaders?


Mike Tajmajer

Jun 17, 2010, 10:37:11 AM
to fog...@googlegroups.com
Hi Petr,

I have made the branch -- if you can stub the "shader" object into the
pipeline, I can start to experiment.


I would focus on the pixel (fragment) shader first, since it is what I would
be using the most.


This would be able to handle gradients and textures as well.

http://www.lighthouse3d.com/opengl/glsl/index.php?fragmentp

http://www.lighthouse3d.com/opengl/glsl/index.php?texture

I haven't read too much about the vertex shaders yet, so I can't speak to
where they would fit in.

I have dug into Cg, and think this would be the language to use.

The Cg would be compiled into ARB code.

In the Fog software renderer, the ARB code would be parsed and turned into
x86/x64 via ASMJit.

This would be executed for each pixel as they were rendered.

I am thinking about using Boost Spirit for the parser -- it is a header-only
library, but it would add a Boost dependency to the new module.

Support for multiple color depths would be built when the JIT code was
built.

For example, the assignment in this fragment would be converted to a
framebuffer write operation which would be specific to the color depth.

void main()
{
    gl_FragColor = gl_Color;
}


Mike Tajmajer

Jun 18, 2010, 1:17:13 PM
to fog...@googlegroups.com

Petr Kobalíček

Jun 19, 2010, 6:16:14 AM
to fog...@googlegroups.com
Hi Mike,

On Thu, Jun 17, 2010 at 4:37 PM, Mike Tajmajer <mi...@igraphicsgroup.com> wrote:
> Hi Petr,
>
> I have made the branch -- if you can stub the "shader" object into the
> pipeline, I can start to experiment.

Good!

> I would focus on the pixel (fragment) shader first, since it is what I would
> be using the most.

Yeah, I think pixel shaders are what we need.

> This would be able to handle gradients and textures as well.
>
> http://www.lighthouse3d.com/opengl/glsl/index.php?fragmentp
>
> http://www.lighthouse3d.com/opengl/glsl/index.php?texture
>
>
>
> I haven't read too much about the vertex shaders yet -- so I can't speak of
> where they would be.
>

Okay, so the fragment shader sits between the texture fetch and the
blender. We can implement blenders later; that's probably a simpler
task :)

>
> I have dug into Cg, and think this would be the language to use.
>
> The Cg would be compiled into ARB code.
>
> In the Fog software renderer, the ARB code would be parsed and turned into
> x86/x64 via ASMJit.
>
> This would be executed for each pixel as they were rendered.

Yes, since ARB is a vector language we can use SSE2 very easily. I think
most of the work will be writing the parser and compiler, and maybe an
optimizer. We can reuse some code from MathPresso.

> I am thinking about using Boost Spirit for the parser -- it is a header only
> library, but would add a dependency for Boost to the new module.

Sorry, but code that depends on Boost will never be accepted into
trunk. I don't think it will be too much work to write a tokenizer,
parser and AST ourselves. Boost might save us 200 lines of code, but
it's a really big dependency I'd like to avoid.

>
> Support for multiple color depths would be built when the JIT code was
> built.
>
> For example, the assignment in this fragment would be converted to a
> framebuffer write operation which would be specific to the color depth.
>
> void main()
> {
>     gl_FragColor = gl_Color;
> }

Yes, this is exactly why the shader concept is good: it is agnostic to
pixel format and depth.

--
Best regards
- Petr Kobalicek <http://kobalicek.com>

Mike Tajmajer

Jun 19, 2010, 9:00:43 AM
to fog...@googlegroups.com
Hi Petr,

OK -- No Boost.

--------------------------
Mike Tajmajer
iGraphicsGroup.com, Inc.


Mike Tajmajer

Jun 22, 2010, 10:52:28 AM
to fog...@googlegroups.com
Hi Petr,

I've got my parser parsing -- though only enough to test with for now.

I think the next step is to look at how the shaders will be connected to the
Fog rasterizer.

Do you have any idea when you'll have this stubbed in?

I would like to implement a test with a simple shader -- from Cg to the
screen.

Petr Kobalíček

Jun 22, 2010, 11:07:45 AM
to fog...@googlegroups.com
Hi Mike,

I have an idea how to do it.

I will look at the architecture in the SVN repo. If I get some time
tonight, I will try to integrate it with Fog (we can start with
fill-rect integration; it's the easiest one ;)).

--
Best regards
- Petr Kobalicek <http://kobalicek.com>

Mike Tajmajer

Jun 22, 2010, 11:11:13 AM
to fog...@googlegroups.com
Hi Petr,

No rush!

If you have a project that is paying money, you should do that first :-)


Petr Kobalíček

Jun 26, 2010, 1:48:31 AM
to fog...@googlegroups.com
Hi Mike,

Is your code available somewhere so I can look at it? Maybe I'll get
some free time to work on integrating such a thing into the shader
branch.

--
Best regards
- Petr Kobalicek <http://kobalicek.com>

Mike Tajmajer

Jun 26, 2010, 10:55:29 AM
to fog...@googlegroups.com
Hi Petr,

> Is your code available somewhere so I can look at it? Maybe I'll get some
> free time to work on integrating such a thing into the shader branch.

The short answer is: not yet.

I was waiting to see how it would integrate into the Fog pipeline before I
spent a lot of time on the compiler.

Today, I have proof of concept code.

The assumptions I have made are that:

1) The fragment shader is compiled into a standard function:
   c_.newFunction(AsmJit::CALL_CONV_DEFAULT, AsmJit::BuildFunction1<void*>());

2) That function would take a single argument, which was a pointer to the
ARBfp Context.

3) The ARBfp Context would contain the result, fragment and primitive
constructs.

4) The ARBfp Context would also have (aligned) scratch memory - to load
immediate floats into the XMM registers, etc.

5) The ARBfp Context would have space to allow for shader defined PARAM
data.

How would it connect with Fog?

My guess is that there would be a method on the painter which would allow a
fragment shader to be attached.

OpenGL does it via this style:

fragment_shader = compile(GL_FRAGMENT_SHADER, f_source);
prog = glCreateProgram();
glAttachShader(prog, fragment_shader);
glLinkProgram(prog);

Inside of Fog, there would be an ARBfp context, and a pointer to the
function.

The context would be preserved with the function; context data would
persist as long as the fragment shader was attached.

When it was time to render the pixels, the context would be loaded with the
data for each pixel (fragment.*) and the function called.

After the fragment function returned, result.color would be converted from 4
floats to RGB and added to the Span (or put directly into the Image).

I guess the worst case would be that each Span would degrade into
individual pixels.

Petr Kobalíček

Jun 28, 2010, 2:15:14 PM
to fog...@googlegroups.com
Hi Mike,

I hoped you would start with something easy, like a plain framebuffer :)
For example, my first testing environment when working on BlitJit was SDL
(I simply loaded a few images and tested how the JIT compiler handled
them).


On Sat, Jun 26, 2010 at 4:55 PM, Mike Tajmajer <mi...@igraphicsgroup.com> wrote:
> Hi Petr,
>
>> Is your code available somewhere so I can look at it? Maybe I'll get some
>> free time to work on integrating such a thing into the shader branch.
>
> The short answer is: not yet.
>
> I was waiting to see how it would integrate into the Fog pipeline before I
> spent a lot of time on the compiler.
>
> Today, I have proof of concept code.
>
>
>
> The assumptions I have made are that:
>
> 1) The fragment shader is compiled into a standard function
> (c_.newFunction(AsmJit::CALL_CONV_DEFAULT,
> AsmJit::BuildFunction1<void*>());)

Oh, AsmJit-1.0 is in SVN, use it instead ;-)

> 2) That function would take a single argument, which was a pointer to the
> ARBfp Context.

Maybe this prototype would be better; it matches the Fog architecture:

typedef void (FOG_FASTCALL *RasterVBlitSpanFn)(
    uint8_t* dst, const Span* span, const RasterClosure* closure);

This kind of function is called for a whole scanline, so the blitter can
cache some computations for the Y coordinate. I don't know if a shader
needs to do that, but we could save some function calls here ;) Of course
the painting architecture will be extended to work with shaders, but
matching the same scheme is good, I think.

> 3) The ARBfp Context would contain the result, fragment and primitive
> constructs.
>
> 4) The ARBfp Context would also have (aligned) scratch memory - to load
> immediate floats into the XMM registers, etc.

I don't understand this; we could use the stack for this, couldn't we?

> 5) The ARBfp Context would have space to allow for shader defined PARAM
> data.

Simply parameters :) We should distinguish between constants and
parameters. Folding constants may improve performance, but that is
definitely something to be done in the future.

>
> How it would connect with Fog?
>
> I did guess that there would be a method in the painter which would allow a
> fragment shader to be attached.
>
> OpenGL does it via this style:
>
> fragment_shader = compile(GL_FRAGMENT_SHADER, f_source);
> prog = glCreateProgram();
> glAttachShader(prog, fragment_shader);
> glLinkProgram(prog);
>
>
>
> Inside of FOG, there would be a ARBfb context, and a pointer to the
> function.
>
> The context would preserved with the function - context data would persist
> as long as the fragment was attached.
>
> When it was time to render the pixels, the context would be loaded with the
> data for each pixel (fragment.*) then called.
>
> After the fragment returned, the result.color would be converted from 4
> floats to RGB and added to the Span (or put directly into the Image).
>
> I guess that the worse case would be that each Span would degrade into
> individual pixels.

The API I imagine is like this:

Fog::Shader shader;
shader.setProgram("some program...", shaderType /* optional? defaults to?*/);

Fog::Painter painter;
painter.setFragmentShader(shader);

...

But what I don't understand is: what is the role of a fragment shader in 2D graphics?

Imagine this scenario:

Painter p;
p.setSource(Argb(0xFFFFFFFF));
p.setOperator(OPERATOR_SRC_OVER);
p.fillRect(IntRect(x, y, w, h));

if I assign a fragment shader, what should it do?

According to http://www.lighthouse3d.com/opengl/glsl/index.php?pipeline
(pipeline overview) it should fetch the source, so this way would
probably be better (it also matches what the shader will do):

Shader shader;
shader.setProgram("some program...", shaderType /* optional? defaults to?*/);

Pattern pattern;
pattern.setShader(shader);

Painter painter;
painter.setPattern(pattern);

--------------------------------------

We can also extend shaders to do pixel blending, so there would be two
types of shaders: fragment (to fetch the source) and raster (to blend
the source to the destination, accepting an optional coverage mask).

Best regards
- Petr
