Changing Opacity of an Entire MonoGame SpriteBatch

Recently, while working on the Bad Echo game framework, I required the ability to apply an overriding amount of transparency to all sprites drawn within a particular SpriteBatch batch operation.

Unfortunately, as the first reply to this forum post indicates, MonoGame’s SpriteBatch class does not support such an operation. The suggestion given by the developer was to mess around with the internals of SpriteBatch to implement said functionality.

Reading this inspired me to solve this problem by doing just that, albeit in my own way. This article explores the solution I devised for this problem and the things I learned while solving it.

The Problem

Let’s spend some time reviewing what I was having trouble doing. After that, I’ll demonstrate how to enhance the SpriteBatch class to do what we want.

A color’s transparency is specified by its alpha channel: the component most commonly represented by the final element in a standard RGBA tuple.

A minimum alpha value is fully transparent, while a maximum value is fully opaque.

The Typical Way of Applying Transparency

Applying some degree of transparency to a texture or sprite we’re rendering is actually a straightforward affair. All we need to do is multiply the Color we’re applying to the sprite by the desired alpha (the alpha being a float-type value between 0 and 1).

SpriteBatch.Draw with Transparency
// Half transparency
spriteBatch.Draw(firstTexture, position1, Color.White * 0.5f);

// Less transparency
spriteBatch.Draw(secondTexture, position2, Color.White * 0.8f);

// Fully opaque
spriteBatch.Draw(thirdTexture, position3, Color.White);

If this satisfies all your needs, you can stop reading right now! If your rendering logic is structured so that each component draws to its own sprite batch operation, it probably does.

No Control Over Hierarchical Rendering Using a Single Batch

If you’re using a hierarchy of components, all drawing to the same SpriteBatch within a single batch operation, the above approach isn’t helpful if we wish to exert control from the root.

For example, let’s say we have a root rendering object that initiates a batch operation via SpriteBatch.Begin. After creating the batch operation, the root passes said SpriteBatch instance to its children, rendering objects responsible for calling SpriteBatch.Draw.

Given something like I just described, each of the children mentioned above will need to play ball if we wish to impose some batch-wide transparency effect.

Where I Ran Into Issues

The Bad Echo game framework includes components that match this description exactly; in particular, its user interface system, which consists of a Screen object that contains layers of controls and is responsible for measuring, arranging, and rendering them on the screen.

This user interface system is made available as a discrete game state by the ScreenState class, whose Draw override is where I originally wanted a way to apply opacity to an entire sprite batch.

A feature of the framework’s game states system is the animated transitioning of a state’s activation and deactivation. I desired to have entire user interfaces (controls and all) be able to fade in or out during these transitions (if so configured).

ScreenState.cs Draw Override
/// <inheritdoc />
public override void Draw(SpriteBatch spriteBatch)
{   // Activation progress is exposed by the ActivationPercentage property.
    spriteBatch.Begin(blendState: BlendState.AlphaBlend,
                      samplerState: SamplerState.PointClamp,
                      rasterizerState: new RasterizerState { ScissorTestEnable = true });

    _screen.Draw(spriteBatch);

    spriteBatch.End();
}


The code begins a sprite batch operation and then sends it off to the _screen, which will itself be passing it to potentially many, many controls. It would sure be nice to have a way to set the opacity to a value based on the ActivationPercentage property mentioned at the top.

One might quickly notice that SpriteBatch.Begin accepts an effect named parameter that allows us to provide a different Effect type to use during rendering passes.

What if we try passing an instance of BasicEffect, a built-in type that exposes a lovely Alpha parameter for us to set?

Using BasicEffect — Won’t Work
/// <inheritdoc />
public override void Draw(SpriteBatch spriteBatch)
{
    var alphaEffect = new BasicEffect(spriteBatch.GraphicsDevice)
                      {   // Using a power curve for a less boring animation.
                          Alpha = (float) Math.Pow(ActivationPercentage, 3)
                      };

    spriteBatch.Begin(blendState: BlendState.AlphaBlend,
                      samplerState: SamplerState.PointClamp,
                      rasterizerState: new RasterizerState { ScissorTestEnable = true },
                      effect: alphaEffect);

    _screen.Draw(spriteBatch);

    spriteBatch.End();
}


Boy, that’d make for a nice short article if this worked. But, it won’t. More than likely, nothing will visibly render on the screen when this code runs.

Even though SpriteBatch allows us to provide it with an effect, it internally relies on the SpriteEffect type for things to actually end up being rendered correctly.

Using a general-purpose effect like BasicEffect, while perhaps sensible when dealing with vertex buffers and the like, isn’t going to fly at all with specialized renderers like SpriteBatch.

How We’ll Solve This

We’re on the right track. What we need is an effect that is like SpriteEffect, in that it does what needs to be done in order for SpriteBatch to work correctly, but one that also makes available an Alpha parameter, like BasicEffect.

So, we’ll make our own effect. This will consist of a managed Effect-based class that exposes an Alpha property which we can manipulate, and the HLSL code for the actual shader which will be running on our pixels to make them transparent.

Let’s start with the scary stuff: the HLSL shader code.

Custom Shader Code

Shaders are an advanced game development topic that I am starting to scratch the surface of myself. They are programs that (very simply put) run on every vertex/pixel our graphics card is rendering.

HLSL is the C-like language used when targeting DirectX with our shader; we use GLSL if we’re targeting OpenGL.

From what I understand, we will always use HLSL with MonoGame, even if we’re targeting OpenGL. This is because we process shader code with the mgfxc utility, which only accepts HLSL, but will convert the HLSL to GLSL using a parser if needed.

The parser used is MojoShader, which translates bytecode as opposed to source code, and which (from what I understand) MonoGame is moving away from.

The reliance on MojoShader, coupled with the requirement to support many platforms, requires us to code our shaders using the effect format, which is technically deprecated now with newer versions of DirectX.

While it’s unfortunate to be learning to code using a deprecated format, we don’t have much of an option with MonoGame. It’s not a waste of time as far as learning goes, however: the only real difference between modern shader setups and effect files is that modern shaders live in individual .hlsl files, as opposed to being combined into one big .fx file through the defining of a technique.

What the Shader Needs to Do

With that rambling preamble concluded, let’s look at what our shader has to do:

  • Everything the stock SpriteEffect does in order for SpriteBatch draw operations to work.
  • Expose an Alpha parameter we can manipulate from our effect class.
  • Override the alpha of the color being used to reflect what Alpha is set to.

The first point directs us to start with the stock SpriteEffect.fx as the base for our shader code.

We can then take a look at how MonoGame exposes parameters on its built-in shaders, and use what we learn from that to add our own Alpha parameter.

Finally, how do we change the alpha of the colors we are rendering? The color data we’ll be working with is a float4 data type with the COLOR[N] semantic.

With that in mind, we can directly set the alpha channel’s value on a COLOR like so:

color.a = Alpha;

However, this most likely isn’t going to give us the results we expect. For example, if Alpha was 0, you’d expect the texture being rendered to be completely transparent, yes? At runtime, you would see that is not the case.

The reason is that the default blending MonoGame uses expects opacity to be represented through the use of premultiplied alpha.
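To make this concrete, here is a rough sketch (in Python, purely illustrative; the function and variable names are my own) of the arithmetic performed by MonoGame's default blend state, which computes source + destination * (1 - source alpha) under the assumption that the source RGB has already been multiplied by alpha:

```python
def alpha_blend(src_rgb, src_a, dst_rgb):
    # Premultiplied "over" blend: src + dst * (1 - src_a), clamped to [0, 1]
    # like the GPU does. It assumes src_rgb was already scaled by src_a.
    return tuple(min(1.0, s + d * (1 - src_a)) for s, d in zip(src_rgb, dst_rgb))

white = (1.0, 1.0, 1.0)
background = (0.2, 0.2, 0.2)

# Naive shader: set the alpha to 0 but leave the RGB channels untouched.
naive = alpha_blend(white, 0.0, background)
print(naive)  # (1.0, 1.0, 1.0) -- the sprite is still fully visible!

# Premultiplied: scale the RGB channels by the alpha as well.
correct = alpha_blend(tuple(c * 0.0 for c in white), 0.0, background)
print(correct)  # (0.2, 0.2, 0.2) -- fully transparent; only the background shows
```

As the sketch shows, zeroing only the alpha channel leaves the source RGB contributing at full strength, which is exactly the surprise described above.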

Premultiplied Alpha


I kept seeing the term “premultiplied alpha” being thrown around, and I had no idea what anyone was talking about. So, I did some reading.

When we typically think of alpha alongside the other color components, we’re normally thinking of straight alpha.

We interface with straight alpha all the time when using tools such as image editing software. For example, if we wanted to paint a lavender color with 50% transparency in Photoshop we’d slap #e6e6fa80 or (230, 230, 250, 0.5) into the color picker.

The 0.5 alpha value achieves 50% transparency and it is completely separate from the preceding RGB values. RGB and alpha values are independent when using straight alpha.

When using straight alpha:

  • RGB values specify the color of what’s being drawn.
  • The alpha value specifies the opacity.

With premultiplied alpha, RGB and alpha values are linked. In order to make an object transparent, you must not only reduce the alpha value, but also the RGB values.

When using premultiplied alpha:

  • RGB values specify how much color of what’s being drawn contributes to the output.
  • The alpha value specifies how much of what’s being drawn obscures whatever is behind it.

Often we think we’re dealing with straight alpha, but the program we’re using may very well be converting it to premultiplied alpha in the background. Premultiplied alpha is used because it gives better results when working with multiple layers.
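One classic illustration of why layering works better with premultiplied alpha (a Python sketch under my own naming, not anything from MonoGame): whenever colors get averaged, as happens during texture filtering or when flattening layers, straight alpha lets the hidden RGB of fully transparent pixels bleed into the result, while premultiplied alpha does not.

```python
def average(c1, c2):
    # Average two RGBA tuples channel-by-channel (e.g., bilinear filtering).
    return tuple((a + b) / 2 for a, b in zip(c1, c2))

def premultiply(c):
    # Bake the alpha into the RGB channels.
    r, g, b, a = c
    return (r * a, g * a, b * a, a)

opaque_red = (1.0, 0.0, 0.0, 1.0)
# A fully transparent pixel whose (invisible) RGB happens to be green.
invisible_green = (0.0, 1.0, 0.0, 0.0)

# Straight alpha: the invisible green leaks into the result.
print(average(opaque_red, invisible_green))  # (0.5, 0.5, 0.0, 0.5)

# Premultiplied alpha: transparent pixels contribute nothing but their alpha.
print(average(premultiply(opaque_red), premultiply(invisible_green)))
# (0.5, 0.0, 0.0, 0.5) -- half-transparent red, as expected
```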

By default, the textures and BlendState we’re using are going to be using premultiplied alpha, so we’ll need to use just that in our shader. If, for some reason, you are using straight alpha, then you’ll want to change your shader to simply set the alpha member of the color and leave the RGB values untouched.

The Shader Itself

SpriteBatch relies on MonoGame’s built-in SpriteEffect.fx shader to function correctly. Because of this, SpriteEffect.fx can be used as a template for our code.

For the draw operations of SpriteBatch to function correctly, our shader needs to expose a projection matrix parameter that it will use to transform vertex positions being inputted.

These positional transforms are required because SpriteBatch operates in 2D instead of the more typical 3D. I’d recommend consulting MonoGame source for more information.

So, we’ll write code that does all this, then tack on an additional Alpha parameter that will be multiplied against the color’s RGB values.

#if OPENGL
    #define _vs(r)  : register(vs, r)
    #define _ps(r)  : register(ps, r)
    #define _cb(r)
    #define VS_MODEL vs_3_0
    #define PS_MODEL ps_3_0

    #define BEGIN_PARAMETERS
    #define END_PARAMETERS

    #define SAMPLE(texture, texCoord) tex2D(texture, texCoord)

    sampler2D Texture : register(s0);
#else
    #define _vs(r)
    #define _ps(r)
    #define _cb(r)
    #define VS_MODEL vs_4_0_level_9_1
    #define PS_MODEL ps_4_0_level_9_1

    #define BEGIN_PARAMETERS    cbuffer Parameters : register(b0) {
    #define END_PARAMETERS      };

    #define SAMPLE(texture, texCoord) texture.Sample(texture##Sampler, texCoord)

    Texture2D<float4> Texture : register(t0);
    sampler TextureSampler : register(s0);
#endif

BEGIN_PARAMETERS
    float4x4 MatrixTransform _vs(c0) _cb(c0);
    float Alpha _vs(c4) _cb(c4);
END_PARAMETERS

struct VSOutput
{
    float4 position : SV_Position;
    float4 color    : COLOR0;
    float2 texCoord : TEXCOORD0;
};

VSOutput SpriteVertexShader(    float4 position : POSITION0,
                                float4 color    : COLOR0,
                                float2 texCoord : TEXCOORD0)
{
    VSOutput output;

    output.position = mul(position, MatrixTransform);
    output.color = color;
    output.color.a = Alpha;
    // Premultiplied alpha: the RGB channels must be scaled by the alpha as well.
    output.color.rgb *= Alpha;
    output.texCoord = texCoord;

    return output;
}

float4 SpritePixelShader(VSOutput input) : SV_Target0
{
    return SAMPLE(Texture, input.texCoord) * input.color;
}

technique SpriteBatch
{
    pass
    {
        VertexShader = compile VS_MODEL SpriteVertexShader();
        PixelShader = compile PS_MODEL SpritePixelShader();
    }
};
The shader code shown above exposes the needed matrix transformation and alpha value parameters and can be targeted towards either OpenGL or DirectX (note: I tested the code in this article using OpenGL only; however, I’m reasonably confident it will run on DirectX, but no guarantees!).

The vertex shader, SpriteVertexShader, transforms the input position using the configured projection matrix and then applies the configured alpha to our color in a manner that’s appropriate when working with colors that use premultiplied alpha (i.e., we’re multiplying the RGB values by the alpha).

Compiling the Shaders

We usually package custom shader effects with our game by adding them as assets through the MonoGame Content Pipeline. The content pipeline will compile the .fx files into platform-specific binary shader object files for us and also provide a means to load our custom effect assets as Effect instances at runtime.

We don’t want to do any of that for a few reasons:

  • Shaders meant to power SpriteBatch require additional managed logic in order to function correctly, therefore the base Effect class will not suit our purposes.
  • I created this effect for it to be used by components belonging to a library — a redistributable library needs to have the binary shader object packed into its assembly, lest we risk violating ye olde principle of least astonishment.

Instead of using the content pipeline, we’ll take matters into our own hands by first using the mgfxc utility to compile the shader, and then embedding the output as a resource in our assembly.

mgfxc Command Line
dotnet tool install --global dotnet-mgfxc
mgfxc AlphaSpriteEffect.fx AlphaSpriteEffect.mgfxo /profile:OpenGL

Substitute OpenGL with DirectX_11 if you’re targeting DirectX instead.

If you are authoring a library, and want support for multiple platforms baked into your assembly, you can follow MonoGame’s example and use *.dx11.mgfxo for DirectX-targeted shader objects and *.ogl.mgfxo for OpenGL-targeted shader objects.

You’ll then want to ensure your custom effect class (which we’ll be going over next) loads the proper bytecode based on the platform being targeted by the consuming project. The Bad Echo game framework is only targeting OpenGL (at least for now; who knows what the future has in store for us!).

Custom Effect Class

We now need to write a managed class that will be responsible for loading our compiled shader as well as shaping its input.

This is a fairly simple exercise. Our custom effect class needs to do the following:

  • Everything that the stock SpriteEffect class does.
  • Have support for an Alpha parameter, exposed as a writable property.
  • Load the bytecode compiled from AlphaSpriteEffect.fx.
/// <summary>
/// Provides a <see cref="SpriteBatch"/> effect that allows control over the alpha channel of all
/// sprites drawn in a batch.
/// </summary>
public sealed class AlphaSpriteEffect : Effect
{
    private EffectParameter _matrixParam;
    private EffectParameter _alphaParam;

    private Viewport _lastViewport;
    private Matrix _projection;

    /// <summary>
    /// Initializes a new instance of the <see cref="AlphaSpriteEffect"/> class.
    /// </summary>
    /// <param name="device">The graphics device used for sprite rendering.</param>
    public AlphaSpriteEffect(GraphicsDevice device)
        : base(device, Properties.Effects.AlphaSpriteEffect)
    {
        CacheEffectParameters();
    }

    /// <summary>
    /// Initializes a new instance of the <see cref="AlphaSpriteEffect"/> class.
    /// </summary>
    /// <param name="cloneSource">The <see cref="AlphaSpriteEffect"/> instance to clone.</param>
    private AlphaSpriteEffect(AlphaSpriteEffect cloneSource)
        : base(cloneSource)
    {
        CacheEffectParameters();
    }

    /// <summary>
    /// Gets or sets an optional matrix used to transform the sprite geometry.
    /// </summary>
    /// <remarks>
    /// A <see cref="Matrix.Identity"/> value is used if this is null.
    /// </remarks>
    public Matrix? MatrixTransform
    { get; set; }

    /// <summary>
    /// Gets or sets the transparency of all sprites drawn in a batch.
    /// </summary>
    /// <remarks>
    /// This is set to be fully opaque by default.
    /// </remarks>
    public float Alpha
    { get; set; } = 1f;

    /// <summary>
    /// Creates a clone of the current <see cref="AlphaSpriteEffect"/> instance.
    /// </summary>
    /// <returns>A cloned <see cref="Effect"/> instance of this.</returns>
    public override Effect Clone()
        => new AlphaSpriteEffect(this);

    /// <summary>
    /// Lazily computes derived parameter values immediately before applying the effect.
    /// </summary>
    protected override void OnApply()
    {
        Viewport viewport = GraphicsDevice.Viewport;

        if (viewport.Width != _lastViewport.Width || viewport.Height != _lastViewport.Height)
        {   // 3D cameras look into the -z direction (z = 1 is in front of z = 0).
            // Sprite batch layers are ordered in the opposite (z = 0 is in front of z = 1).
            // We correct this by passing 0 for zNearPlane and -1 for zFarPlane; essentially a
            // reverse mapping of the two.
            Matrix.CreateOrthographicOffCenter(
                0, viewport.Width, viewport.Height, 0, 0, -1, out _projection);

            if (GraphicsDevice.UseHalfPixelOffset)
            {
                _projection.M41 -= 0.5f * _projection.M11;
                _projection.M42 -= 0.5f * _projection.M22;
            }

            _lastViewport = viewport;
        }

        if (MatrixTransform.HasValue)
            _matrixParam.SetValue(MatrixTransform.GetValueOrDefault() * _projection);
        else
            _matrixParam.SetValue(_projection);

        _alphaParam.SetValue(Alpha);
    }

    [MemberNotNull(nameof(_matrixParam), nameof(_alphaParam))]
    private void CacheEffectParameters()
    {
        _matrixParam = Parameters[nameof(MatrixTransform)];
        _alphaParam = Parameters[nameof(Alpha)];
    }
}
Putting It Together

Let’s rewrite our ScreenState.Draw override to make use of these new goodies.

Updated ScreenState.cs Draw Override
/// <inheritdoc />
public override void Draw(SpriteBatch spriteBatch)
{
    var alphaEffect = new AlphaSpriteEffect(_device)
                      {   // Using a power curve for a less boring animation.
                          Alpha = (float) Math.Pow(ActivationPercentage, 3)
                      };

    spriteBatch.Begin(blendState: BlendState.AlphaBlend,
                      samplerState: SamplerState.PointClamp,
                      rasterizerState: new RasterizerState { ScissorTestEnable = true },
                      effect: alphaEffect);

    _screen.Draw(spriteBatch);

    spriteBatch.End();
}

And voilà! It works.

Don’t believe me? Hmm…the hour grows late, and I don’t have a super fancy and involved user interface on hand at the moment. So, let’s just whip up an interface with a very simple control layout and throw it into a ScreenState:

[Animation: opacity being applied to an entire SpriteBatch]
The increasing alpha shown here is being applied to the entire SpriteBatch.

Even in this simple example, there are numerous components that are actually drawing to the single SpriteBatch being managed by the ScreenState instance.

And, of course, with more complex control layouts, we’ll have many, many more components drawing to said SpriteBatch. Regardless of the number of objects doing draw calls, the changing opacity we observe in the above image is only applied in one place: the root of the sprite batch operation itself.