As I’m writing this, the following video is being rendered. Framator is a very simple plugin, born out of a lack of ideas: it creates four rectangles around your screen. You can use them as the default frame settings, or play around with them and create some stunning Mograph.  

If you came here from the video, you’ll probably know that I, Chubak Bidpaa, have created this plugin. The parameters are self-explanatory, but I’ll give them a quick review just in case:

  • Height and Width: the height and the width of the rectangles
  • Offsets: the position of the rectangles
  • Colors: the color of the gradient
  • Angle: the angle of the gradient
  • Stops: the stops of the gradient

If you want an effect similar to mine, just use time in the angle’s expression box.

No more loitering. Here’s the download link.

Donation

I’m not sure if anyone wants to donate via Bitcoin, but in case you’ve enjoyed my plugin, and wish to give something back in return, use this Wallet to donate:

bitcoincash:qrt4fx3npgef7yczj6cxzfdfrtqztzekjqdf52rtgj

Thank you. Thank you all. Chubak Out.

This is not something that you’d find in a journal, or a book: this is something I myself have cooked up. But what do I mean by “Parametric Approach to OpenGL”, and by “Using After Effects”?

Browse the latest posts below to find a download link.

The above video is an After Effects plugin by yours truly. Everything is based upon one fragment shader (which I got from a pastebin, but it looks like it’s from this repo), which is as follows:

#version 330
uniform sampler2D videoTexture;
uniform float sliderVal;
uniform float multiplier16bit;
in vec4 out_pos;
in vec2 out_uvs;
out vec4 colourOut;

#define PI 3.14159265358979323846


float random (in vec2 _st) {
    return fract(sin(dot(_st.xy,
                         vec2(12.9898,78.233)))*
        43758.5453123);
}

vec2 truchetPattern(in vec2 _st, in float _index){
    _index = fract(((_index-0.5)*2.0));
    if (_index > 0.75) {
        _st = vec2(1.0) - _st;
    } else if (_index > 0.5) {
        _st = vec2(1.0-_st.x,_st.y);
    } else if (_index > 0.25) {
        _st = 1.0-vec2(1.0-_st.x,_st.y);
    }
    return _st;
}

void main() {
    vec2 st = gl_FragCoord.xy/sliderVal;
    st /= 10.0;

    vec2 ipos = floor(st);  // integer part: which tile we are in
    vec2 fpos = fract(st);  // fractional part: position inside the tile

    vec2 tile = truchetPattern(fpos, random(ipos));

    float color = step(tile.x, tile.y);

    // #version 330 has no gl_FragColor; write to the declared output instead
    colourOut = vec4(vec3(color), 1.0);
}

But you may say “so what, what’s so parametric about it?” Before doing anything or saying anything, let’s scrutinize the basis of an After Effects OpenGL plugin.

All AE OpenGL plugins are based on one example in the SDK: GLator. This example was removed for a long time, maybe 4-5 years, and returned in 2017. Without it, image manipulation would have to be done using Adobe’s own mangled suites, something I wouldn’t wish on my worst enemies. The After Effects SDK is based on a few command selectors:

switch (cmd) {
			case PF_Cmd_ABOUT:
				err = About(in_data,
							out_data,
							params,
							output);
				break;
				
			case PF_Cmd_GLOBAL_SETUP:
				err = GlobalSetup(	in_data,
									out_data,
									params,
									output);
				break;
				
			case PF_Cmd_PARAMS_SETUP:
				err = ParamsSetup(	in_data,
									out_data,
									params,
									output);
				break;
				
			case PF_Cmd_GLOBAL_SETDOWN:
				err = GlobalSetdown(	in_data,
										out_data,
										params,
										output);
				break;

			case  PF_Cmd_SMART_PRE_RENDER:
				err = PreRender(in_data, out_data, reinterpret_cast<PF_PreRenderExtra*>(extra));
				break;

			case  PF_Cmd_SMART_RENDER:
				err = SmartRender(in_data, out_data, reinterpret_cast<PF_SmartRenderExtra*>(extra));
				break;
		}

All of this is engulfed in a try...catch statement. You may have noticed the function calls under each command selector. Those are the bread and mead of an After Effects plugin. Our concern is with PreRender() and SmartRender(). PreRender() is not that important:

static PF_Err
PreRender(
	PF_InData				*in_data,
	PF_OutData				*out_data,
	PF_PreRenderExtra		*extra)
{
	PF_Err	err = PF_Err_NONE,
			err2 = PF_Err_NONE;

	PF_ParamDef slider_param;

	PF_RenderRequest req = extra->input->output_request;
	PF_CheckoutResult in_result;

	AEFX_CLR_STRUCT(slider_param);

	ERR(PF_CHECKOUT_PARAM(in_data,
		GLATOR_SLIDER,
		in_data->current_time,
		in_data->time_step,
		in_data->time_scale,
		&slider_param));

	ERR(extra->cb->checkout_layer(in_data->effect_ref,
		GLATOR_INPUT,
		GLATOR_INPUT,
		&req,
		in_data->current_time,
		in_data->time_step,
		in_data->time_scale,
		&in_result));

	if (!err){
		UnionLRect(&in_result.result_rect, &extra->output->result_rect);
		UnionLRect(&in_result.max_result_rect, &extra->output->max_result_rect);
	}
	ERR2(PF_CHECKIN_PARAM(in_data, &slider_param));
	return err;
}

You can see a call to PF_CHECKOUT_PARAM in the code. “Obtains parameter values, or the source video layer, at a specified time. After Effects makes caching decisions based on the checkout state of parameters.”1 This PARAM has been defined before, in ParamsSetup():

static PF_Err 
ParamsSetup (	
	PF_InData		*in_data,
	PF_OutData		*out_data,
	PF_ParamDef		*params[],
	PF_LayerDef		*output )
{
	PF_Err		err		= PF_Err_NONE;
	PF_ParamDef	def;	

	AEFX_CLR_STRUCT(def);

	PF_ADD_SLIDER(	STR(StrID_Name), 
					GLATOR_SLIDER_MIN, 
					GLATOR_SLIDER_MAX, 
					GLATOR_SLIDER_MIN, 
					GLATOR_SLIDER_MAX, 
					GLATOR_SLIDER_DFLT,
					SLIDER_DISK_ID);

	out_data->num_params = GLATOR_NUM_PARAMS;

	return err;
}

You can add as many PARAMs to your program as you wish and later pass them to your shaders as uniforms. But how? For that, we must see what goes on in SmartRender():

static PF_Err
SmartRender(
	PF_InData				*in_data,
	PF_OutData				*out_data,
	PF_SmartRenderExtra		*extra)
{
	PF_Err				err = PF_Err_NONE,
						err2 = PF_Err_NONE;

	PF_EffectWorld		*input_worldP = NULL,
						*output_worldP = NULL;
	PF_WorldSuite2		*wsP = NULL;
	PF_PixelFormat		format = PF_PixelFormat_INVALID;
	PF_FpLong			sliderVal = 0;

	AEGP_SuiteHandler suites(in_data->pica_basicP);

	PF_ParamDef slider_param;
	AEFX_CLR_STRUCT(slider_param);

	ERR(PF_CHECKOUT_PARAM(in_data,
		GLATOR_SLIDER,
		in_data->current_time,
		in_data->time_step,
		in_data->time_scale,
		&slider_param));

	if (!err){
		sliderVal = slider_param.u.fd.value / 100.0f;
	}

	ERR((extra->cb->checkout_layer_pixels(in_data->effect_ref, GLATOR_INPUT, &input_worldP)));

	ERR(extra->cb->checkout_output(in_data->effect_ref, &output_worldP));

	ERR(AEFX_AcquireSuite(in_data,
		out_data,
		kPFWorldSuite,
		kPFWorldSuiteVersion2,
		"Couldn't load suite.",
		(void**)&wsP));

	if (!err){
		try
		{
			// always restore back AE's own OGL context
			SaveRestoreOGLContext oSavedContext;

			// our render specific context (one per thread)
			AESDK_OpenGL::AESDK_OpenGL_EffectRenderDataPtr renderContext = GetCurrentRenderContext();

			if (!renderContext->mInitialized) {
				//Now comes the OpenGL part - OS specific loading to start with
				AESDK_OpenGL_Startup(*renderContext.get(), S_GLator_EffectCommonData.get());

				renderContext->mInitialized = true;
			}

			renderContext->SetPluginContext();
			
			// - Gremedy OpenGL debugger
			// - Example of using a OpenGL extension
			bool hasGremedy = renderContext->mExtensions.find(gl::GLextension::GL_GREMEDY_frame_terminator) != renderContext->mExtensions.end();

			A_long				widthL = input_worldP->width;
			A_long				heightL = input_worldP->height;

			//loading OpenGL resources
			AESDK_OpenGL_InitResources(*renderContext.get(), widthL, heightL, S_ResourcePath);

			CHECK(wsP->PF_GetPixelFormat(input_worldP, &format));

			// upload the input world to a texture
			size_t pixSize;
			gl::GLenum glFmt;
			float multiplier16bit;
			gl::GLuint inputFrameTexture = UploadTexture(suites, format, input_worldP, output_worldP, in_data, pixSize, glFmt, multiplier16bit);
			
			// Set up the frame-buffer object just like a window.
			AESDK_OpenGL_MakeReadyToRender(*renderContext.get(), renderContext->mOutputFrameTexture);
			ReportIfErrorFramebuffer(in_data, out_data);

			glViewport(0, 0, widthL, heightL);
			glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
			glClear(GL_COLOR_BUFFER_BIT);
			
			// - simply blend the texture inside the frame buffer
			// - TODO: hack your own shader there
			RenderGL(renderContext, widthL, heightL, inputFrameTexture, sliderVal, multiplier16bit);

			// - we toggle PBO textures (we use the PBO we just created as an input)
			AESDK_OpenGL_MakeReadyToRender(*renderContext.get(), inputFrameTexture);
			ReportIfErrorFramebuffer(in_data, out_data);

			glClear(GL_COLOR_BUFFER_BIT);

			// swizzle using the previous output
			SwizzleGL(renderContext, widthL, heightL, renderContext->mOutputFrameTexture, multiplier16bit);

			if (hasGremedy) {
				gl::glFrameTerminatorGREMEDY();
			}

			// - get back to CPU the result, and inside the output world
			DownloadTexture(renderContext, suites, input_worldP, output_worldP, in_data,
				format, pixSize, glFmt);

			glBindFramebuffer(GL_FRAMEBUFFER, 0);
			glBindTexture(GL_TEXTURE_2D, 0);
			glDeleteTextures(1, &inputFrameTexture);
		}
		catch (PF_Err& thrown_err)
		{
			err = thrown_err;
		}
		catch (...)
		{
			err = PF_Err_OUT_OF_MEMORY;
		}
	}

	// If you have PF_ABORT or PF_PROG higher up, you must set
	// the AE context back before calling them, and then take it back again
	// if you want to call some more OpenGL.		
	ERR(PF_ABORT(in_data));

	ERR2(AEFX_ReleaseSuite(in_data,
		out_data,
		kPFWorldSuite,
		kPFWorldSuiteVersion2,
		"Couldn't release suite."));
	ERR2(PF_CHECKIN_PARAM(in_data, &slider_param));
	ERR2(extra->cb->checkin_layer_pixels(in_data->effect_ref, GLATOR_INPUT));

	return err;
}

“But Chubak, where do we pass the uniforms?” Patience, dear Constance, patience. We pass them in RenderGL():

void RenderGL(const AESDK_OpenGL::AESDK_OpenGL_EffectRenderDataPtr& renderContext,
				  A_long widthL, A_long heightL,
				  gl::GLuint		inputFrameTexture,
				  PF_FpLong			sliderVal,
				  float				multiplier16bit)
	{
		// - make sure we blend correctly inside the framebuffer
		// - even though we just cleared it, another effect may want to first
		// draw some kind of background to blend with
		glEnable(GL_BLEND);
		glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
		glBlendEquation(GL_FUNC_ADD);

		// view matrix, mimic windows coordinates
		vmath::Matrix4 ModelviewProjection = vmath::Matrix4::translation(vmath::Vector3(-1.0f, -1.0f, 0.0f)) *
			vmath::Matrix4::scale(vmath::Vector3(2.0 / float(widthL), 2.0 / float(heightL), 1.0f));

		glBindTexture(GL_TEXTURE_2D, inputFrameTexture);

		glUseProgram(renderContext->mProgramObjSu);

		// program uniforms
		GLint location = glGetUniformLocation(renderContext->mProgramObjSu, "ModelviewProjection");
		glUniformMatrix4fv(location, 1, GL_FALSE, (GLfloat*)&ModelviewProjection);
		location = glGetUniformLocation(renderContext->mProgramObjSu, "sliderVal");
		glUniform1f(location, sliderVal);
		location = glGetUniformLocation(renderContext->mProgramObjSu, "multiplier16bit");
		glUniform1f(location, multiplier16bit);

		// Identify the texture to use and bind it to texture unit 0
		AESDK_OpenGL_BindTextureToTarget(renderContext->mProgramObjSu, inputFrameTexture, std::string("videoTexture"));

		// render
		glBindVertexArray(renderContext->vao);
		RenderQuad(renderContext->quad);
		glBindVertexArray(0);

		glUseProgram(0);
		glDisable(GL_BLEND);
	}

RenderGL(), like SwizzleGL(), is one of a series of functions defined at the top of the file.

So what does it all amount to? Go back to the top, to the very first listing, and you’ll see a sliderVal among the uniforms. That’s what I mean by parametric OpenGL. Technically, every OpenGL program is parametric; however, this gives us a slider, or a point, or a value (depending on the type of the PARAM; I recommend reading the SDK manual). Ipso facto, parametric here means “something to mess around with”.

There’s a considerable amount of money in After Effects plugin development, and this is, perhaps, the very first blog post about this SDK; I’m not sure. When I was a kid, I spent my own fair share of money on After Effects plugins. For a brief period in my life, they were my awe, my life and my livelihood. Make an After Effects plugin, make a kid happy!

I hope you enjoyed this very short post. I have a brouhaha with /r/EnglishLearning. They believed that my prose is, as my friend Tanami puts it, brash. If you believe so, please tell me so I can do something about it. Thank you.

 

I’ve found another book to mack on while you mack with your paramour. It’s called Fractal Worlds: Grown, Built and Imagined. I’m going to write a fractal generating software based on it, if I don’t die tomorrow. Chubak Out.

 

Further Reading

  • After Effects SDK Manual

This will be a very short entry, as I’ve already made a post, so forgive me for the brevity of this particular parcel.

The zest that drives me in the world of graphics programming is being able to create an application like Adobe’s After Effects. I have, in the past, had a very fun time with this software, of which I am quite fond. But in order to achieve such a feat, one must first release a plugin or two for the software. Me, I plan on writing a book on the subject. But that’s far into the distance.

But there are many blockades on the road which must be dealt with. The first is the fact that my Visual Studio can’t debug the damn plugins. I haven’t written any yet, but I can’t get the Visual Studio debugger to work: IntelliTrace won’t show the value of any of the variables, even if I engage the plugin’s PiPL entry point, set many breakpoints, even in the wretched About() function. Therefore, I couldn’t have written any plugins even if I wanted to, as blindly compiling the .aex files back and forth is simply absurd.

I have many other problems as well, but they can be solved by reading the manual. And I’m reading it thoroughly, and carefully. Let me explain how this API works:

  1. Parameters
  2. Flags
  3. Suites

Parameters are the UI elements which you deal with in a plugin. They’re few, but applicable. Flags are directives that the plugin gives to the application. And suites are collections of functions that make up the API’s functional libraries.

For example, here’s the render function of the Skeleton example:

static PF_Err 
Render (
	PF_InData		*in_data,
	PF_OutData		*out_data,
	PF_ParamDef		*params[],
	PF_LayerDef		*output )
{
	PF_Err				err		= PF_Err_NONE;
	AEGP_SuiteHandler	suites(in_data->pica_basicP);

	/*	Put interesting code here. */
	GainInfo			giP;
	AEFX_CLR_STRUCT(giP);
	A_long				linesL	= 0;

	linesL 		= output->extent_hint.bottom - output->extent_hint.top;
	giP.gainF 	= params[SKELETON_GAIN]->u.fs_d.value;
	
	if (PF_WORLD_IS_DEEP(output)){
		ERR(suites.Iterate16Suite1()->iterate(	in_data,
												0,								// progress base
												linesL,							// progress final
												&params[SKELETON_INPUT]->u.ld,	// src 
												NULL,							// area - null for all pixels
												(void*)&giP,					// refcon - your custom data pointer
												MySimpleGainFunc16,				// pixel function pointer
												output));
	} else {
		ERR(suites.Iterate8Suite1()->iterate(	in_data,
												0,								// progress base
												linesL,							// progress final
												&params[SKELETON_INPUT]->u.ld,	// src 
												NULL,							// area - null for all pixels
												(void*)&giP,					// refcon - your custom data pointer
												MySimpleGainFunc8,				// pixel function pointer
												output));	
	}

	return err;
}

The function takes in_data and out_data, which are the input data and the data we generate; the params array, which holds the parameters of the UI; and finally a PF_LayerDef, the layer the effect has been applied to.

First, it creates a GainInfo variable which will later serve as our reference construct (the refcon), does some calculations on it, and then passes it as the refcon to our two 8-bit and 16-bit pixel functions. They, in turn, cast the refcon back into the proper type, do some calculations with it, and carry out the operation.

This is the basis of how AE plugins do what they do. Of course, there are other plugin types, such as AEGP (which Element 3D falls under). Element 3D, for example, uses Assimp to load models, then dumps the framebuffer into an image and passes it to After Effects.

That is it for the moment. I hope you’ve enjoyed this post. I will keep you updated on my exploits. And my first plugin will definitely be free.