Digging more into the Molehill APIs

A few months ago at MAX 2010 in Los Angeles, we announced the introduction of the Molehill APIs in the Adobe Flash runtimes on mobile and desktop. For more info, check the “Molehill” page on Adobe Labs. I wanted to give you guys more details about Molehill, and some more technical details on how it is going to work from an ActionScript developer's standpoint.

So let's get started ;)

What is Molehill?

“Molehill” is the codename for the set of GPU-accelerated 3D APIs that will be exposed in ActionScript 3 in the Adobe Flash Player and Adobe AIR, enabling high-end 3D rendering inside the Adobe Flash Platform. Molehill will rely on DirectX 9 on Windows and on OpenGL 1.3 on MacOS and Linux. On mobile platforms like Android, Molehill will rely on OpenGL ES 2. Technically, the Molehill APIs are fully programmable, shader-based GPU APIs, and will expose features that 3D developers have been waiting for in Flash for a long time: programmable vertex and fragment shaders, to enable things like vertex skinning on the GPU for bones animation, but also native z-buffering, a stencil buffer, cube textures and more.

In terms of performance, Adobe Flash Player 10.1 today renders thousands of non-z-buffered triangles at approximately 30 Hz. With the new 3D APIs, developers can expect hundreds of thousands of z-buffered triangles to be rendered in full screen at HD resolution, at around 60 Hz. Molehill will make it possible to deliver sophisticated 3D experiences across almost every computer and device connected to the Internet. To get an idea of how Molehill performs and see a live demo, check this video.

The way it works.

The existing 2.5D APIs that we introduced in Flash Player 10 are not deprecated; the Molehill APIs will offer a solution for advanced 3D rendering requiring full GPU acceleration. Depending on the project you are working on, you will decide which APIs you want to use.

We recently introduced the concept of “Stage Video” in Flash Player 10.2, available as a beta on Adobe Labs.
Stage Video relies on the same design, enabling full GPU acceleration for video, from decoding to presentation. With this new rendering model, the Adobe Flash Player does not present the video frames or the 3D buffer inside the display list, but inside a texture sitting behind the stage, painted through the GPU. This allows the Adobe Flash Player to paint the content available in graphics card memory directly on screen. No more read-back is required to retrieve the frames from the GPU and push them on screen through the display list on the CPU.

As a result, because the 3D content sits behind the Flash Player stage and is not part of the display list, the Context3D and Stage3D objects are not display objects. So remember that you cannot interact with them like with any DisplayObject: rotations, blend modes, filters and many other effects cannot be applied.

The following figure illustrates the idea:

Stage3D Model

Of course, as you can see, 2D content can overlay the 3D content with no problem, but the opposite is not possible. However, we will provide an API which will allow you to draw your 3D content to a BitmapData if required. From an ActionScript API standpoint, as a developer you interact with two main objects: a Stage3D and a Context3D object. You request a 3D context from the Adobe Flash Player, and a Context3D object is created for you. So now you may wonder: what happens if the GPU driver is incompatible? Do I get a black screen failing silently?
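To make this concrete, here is a minimal sketch of what requesting a context could look like; the exact names may still evolve before release, and note that the context is delivered asynchronously:

```actionscript
import flash.display.Stage3D;
import flash.display3D.Context3D;
import flash.events.Event;

// request a 3D context from the first Stage3D layer on the stage;
// the context arrives asynchronously through Event.CONTEXT3D_CREATE
var stage3D : Stage3D = stage.stage3Ds[0];
stage3D.addEventListener( Event.CONTEXT3D_CREATE, onContextCreated );
stage3D.requestContext3D();

function onContextCreated ( e : Event ) : void
{
	var context3D : Context3D = stage3D.context3D;
	// set up the back buffer: width, height, antialias level, depth/stencil
	context3D.configureBackBuffer( 800, 600, 2, true );
}
```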

The Flash Player will still return a Context3D object, but one using a software fallback internally, so you will still get all the Molehill features and the same API, just running on the CPU. To achieve this, we rely on a very fast CPU rasterizer from TransGaming Inc. called “SwiftShader”. The great news is that even when running in software, SwiftShader runs about 10 times faster than today’s vector rasterizer in Flash Player 10.1, so you can expect some serious performance improvements even in software mode.
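As a sketch, assuming the driverInfo String exposed on Context3D in the current builds, you could check which rasterizer you actually got and adapt your content accordingly:

```actionscript
// Context3D.driverInfo describes the rendering backend in use;
// when the software fallback is active, the string contains "Software"
if ( context3D.driverInfo.indexOf( "Software" ) != -1 )
{
	// running on SwiftShader: consider lowering scene complexity
	trace( "Software rendering: " + context3D.driverInfo );
}
else
{
	trace( "Hardware rendering: " + context3D.driverInfo );
}
```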

The beauty of the “Molehill” APIs is that you do not have to worry about what is happening internally. Am I running on DirectX, OpenGL or SwiftShader? Should I use a different API for OpenGL on MacOS and Linux, or for OpenGL ES 2 on a mobile platform? No, everything is transparent to you as a developer: you program one single API, and the Adobe Flash Player handles everything internally, doing the translation behind the scenes.

It is important to remember that the Molehill APIs do not use what is called a fixed-function pipeline, but a programmable pipeline only, which means that you will have to work with vertex and fragment shaders to display anything on screen. For this, you will upload your shaders to the graphics card as pure low-level AGAL (“Adobe Graphics Assembly Language”) bytecode in a ByteArray. As a developer you have two ways to do this: write your shaders at the assembly level, which requires an advanced understanding of how shaders work, or use a higher-level language like Pixel Bender 3D, which exposes a more natural way to program your shaders and compiles the appropriate AGAL bytecode for you.

In order to represent your triangles, you will work with VertexBuffer3D and IndexBuffer3D objects, passing vertex coordinates and indices, and once your vertex and fragment shaders are ready, you can upload them to the graphics card through a Program3D object. Basically, a vertex shader deals with the position of the vertices used to draw your triangles, whereas a fragment shader handles the appearance of the pixels used to texture your triangles.
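The index buffer side can be sketched as follows for a single triangle (assuming a context3D object has already been created; the vertex side is detailed in the next sections):

```actionscript
// an index buffer holding 3 indices: one triangle referencing vertices 0, 1 and 2
var indexbuffer : IndexBuffer3D = context3D.createIndexBuffer( 3 );
indexbuffer.uploadFromVector( Vector.<uint>([ 0, 1, 2 ]), 0, 3 ); // offset 0, count 3
```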

The following figure illustrates the difference between the types of shaders:

Vertex and Fragment Shaders
As stated before, Molehill does not use a fixed function pipeline, hence developers will be free to create their own custom shaders and totally control the rendering pipeline. So let’s focus a little bit on the concept of vertex and fragment shaders with Molehill.

Digging into vertex and fragment shaders

To illustrate the idea, here is a simple example of the low-level shading assembly you could write to display your triangles and work at the pixel level with Molehill. Get ready, because we are going to go very low-level and code shaders close to the metal ;). Of course, if you hate this, do not worry: you will be able to use a higher-level shading language like Pixel Bender 3D.

Note: To compile the assembly String to AGAL bytecode, download the AGALMiniAssembler here.

To create our shader program to upload to the graphics card, we first need a vertex shader (Context3DProgramType.VERTEX), which should at least output a clip-space position. To do this, we multiply va0 (vertex attribute 0, the vertex position) by vc0 (vertex constant 0, our projection matrix stored at this index) and output the result through the op register (standing for “output position” of the vertex shader):

// create a vertex program - from assembly
var vertexShaderAssembler : AGALMiniAssembler = new AGALMiniAssembler();

vertexShaderAssembler.assemble( Context3DProgramType.VERTEX,
"m44 op, va0, vc0 \n" // 4x4 matrix transform from stream 0 (vertex position) to output clipspace
);

Now you may wonder, what is this m44 thing? Where does it come from?

It is actually a 4x4 matrix transformation: it projects our vertices according to the projection matrix we defined. We could have written our shader like the following, manually calculating the dot product for each component, but the m44 instruction (performing the full 4x4 matrix transform in one line) is way shorter:

// create a vertex program - from assembly
var vertexShaderAssembler : AGALMiniAssembler = new AGALMiniAssembler();

vertexShaderAssembler.assemble( Context3DProgramType.VERTEX,
"dp4 op.x, va0, vc0 \n" + // 4x4 matrix transform from stream 0 (vertex position) to output clipspace
"dp4 op.y, va0, vc1 \n" +
"dp4 op.z, va0, vc2 \n" +
"dp4 op.w, va0, vc3 \n"
);

Remember vc0 (vertex constant 0): it is actually just our projection matrix stored at this index, passed earlier as a constant through the setProgramConstantsFromMatrix API on the Context3D object:

context3D.setProgramConstantsFromMatrix(Context3DProgramType.VERTEX, 0, modelMatrix, true );

As with our matrix constant, va0 (vertex attribute 0), the position, needs to be defined, and we did this through the setVertexBufferAt API on the Context3D object:

context3D.setVertexBufferAt (0, vertexbuffer, 0, Context3DVertexBufferFormat.FLOAT_3 );

In our example, the vertex shader passes the vertex color (va1) to the fragment shader through v0, using the mov instruction, so we can actually paint our triangle's pixels. To do this, we could write the following:

// create a vertex program - from assembly
var vertexShaderAssembler : AGALMiniAssembler = new AGALMiniAssembler();

vertexShaderAssembler.assemble( Context3DProgramType.VERTEX,
"m44 op, va0, vc0 \n" +	// 4x4 matrix transform from stream 0 (vertex position) to output clipspace
"mov v0, va1 \n"  	// copy stream 1 (vertex color) to fragment shader
);

And as you can imagine, va1 (vertex attribute 1), the color, was defined through setVertexBufferAt, to expose our vertex colors (three floats) in the shaders:

context3D.setVertexBufferAt( 1, vertexbuffer, 3, Context3DVertexBufferFormat.FLOAT_3 );

Our vertex positions and colors are defined in our VertexBuffer3D object:

// create a vertex buffer
// format is (x,y,z,r,g,b) = 3 vertices, 6 dwords per vertex
var vertexbuffer : VertexBuffer3D = context3D.createVertexBuffer( 3, 6 );
vertexbuffer.uploadFromVector ( Vector.<Number>([
-1,-1,0,  255/255,0,0,              // red
0,1,0,    193/255,216/255,47/255,   // green
1,-1,0,   0,164/255,228/255         // blue
]), 0, 3 ); // start at offset 0, count 3

We have our vertex shader defined; now we need to define and upload our fragment shader (Context3DProgramType.FRAGMENT). The idea is to retrieve the vertex color passed along (copied from va1 to v0) and output it through the oc register (the output color):

var fragmentShaderAssembler : AGALMiniAssembler= new AGALMiniAssembler();
fragmentShaderAssembler.assemble( Context3DProgramType.FRAGMENT,
"mov oc, v0" // output color
);

As you can imagine, a fragment shader should always output a color. Then, we need to upload all this to the Context3D object:

// upload the AGAL bytecode
program = context3D.createProgram();
program.upload( vertexShaderAssembler.agalcode, fragmentShaderAssembler.agalcode );

If we compile and run those shaders, we would get the following result:

Hello Triangle
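For completeness, here is a sketch of the per-frame calls that produce this result, assuming the program from the snippet above and a hypothetical indexbuffer (an IndexBuffer3D holding the indices 0, 1, 2 of our triangle):

```actionscript
// typically wired to Event.ENTER_FRAME
function render ( e : Event ) : void
{
	context3D.clear( 0, 0, 0, 1 );                // clear the back buffer to black
	context3D.setProgram( program );              // bind our vertex/fragment program
	context3D.drawTriangles( indexbuffer, 0, 1 ); // draw 1 triangle
	context3D.present();                          // swap the back buffer to screen
}
```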

Now, let's say we need to invert the color of each pixel; it is really easy. As this operation is performed on the pixel colors only, we just modify our fragment shader and use the sub opcode to subtract the color, as follows:

var fragmentShaderAssembler : AGALMiniAssembler= new AGALMiniAssembler();
fragmentShaderAssembler.assemble( Context3DProgramType.FRAGMENT,
"sub ft0, fc1, v0 \n" + // subtract the color ( 1 - color)
"mov oc, ft0" // output color
);

Here, we invert the color of each pixel by subtracting each pixel color from 1 (white). The white pixel we subtract from is stored in a fragment constant (fc1) that we passed by using the setProgramConstantsFromVector API:

context3D.setProgramConstantsFromVector(Context3DProgramType.FRAGMENT, 1, Vector.<Number>([ 1, 1, 1, 1 ]) );

The final pixel color is then stored in a fragment temporary register (ft0) and passed as the final output color.

By using this modified fragment shader, we end up with the following result:

Hello Triangle Inverted

As another exercise, let's apply a sepia-tone filter.

To achieve this, we first need to convert the pixels to grayscale, then tint them sepia. We would use the following fragment shader for this:

var fragmentShaderAssembler : AGALMiniAssembler= new AGALMiniAssembler();
fragmentShaderAssembler.assemble( Context3DProgramType.FRAGMENT,
"dp3 ft0, fc1, v0  \n" + // convert to grayscale
"mul ft1, fc2, ft0 \n" + // convert to sepia
"mov oc, ft1" // output color
);

As usual, we would have defined our constants using the setProgramConstantsFromVector API:

// grayscale
context3D.setProgramConstantsFromVector(Context3DProgramType.FRAGMENT, 1, Vector.<Number>( [ 0.3, 0.59, 0.11, 1 ] ) );
// sepia
context3D.setProgramConstantsFromVector(Context3DProgramType.FRAGMENT, 2, Vector.<Number>( [ 1.2, 1.0, 0.8, 1 ] ) );
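To make the math concrete, here is what this shader computes for the pure red vertex (1, 0, 0), checked on the CPU:

```actionscript
// grayscale: dot product of the color with the luminance weights (fc1)
var gray : Number = 0.3 * 1 + 0.59 * 0 + 0.11 * 0; // 0.3
// sepia: scale the gray value by the warm tint (fc2)
var sepia : Vector.<Number> = Vector.<Number>([ 1.2 * gray, 1.0 * gray, 0.8 * gray ]);
// sepia is now [0.36, 0.3, 0.24], a warm brownish red
```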

By using such a fragment shader, we would end up with the following result:

Hello Triangle Sepia

As you can imagine, this gives you a lot of power and will allow you to go way further in terms of shading: handling things like lighting through vertex or fragment shading, fog, or even animation through vertex skinning, and more.

OK, one last example: let's now apply a texture to our triangle from a BitmapData. To do this, we need to pass uv values from our vertex shader to our fragment shader, and then use those values to sample our texture in the fragment shader.

To pass the uv values, we would need to modify our vertex shader this way:

vertexShaderAssembler = new AGALMiniAssembler();
vertexShaderAssembler.assemble( Context3DProgramType.VERTEX,
"m44 op, va0, vc0    \n" + // 4x4 matrix transform from stream 0 to output clipspace
"mov v0, va1         \n"   // copy texcoord from stream 1 to fragment program
);

Our uv coordinates are now copied from va1 to v0, ready to be passed to the fragment shader. Notice that we do not pass any vertex color anymore to the fragment shader, just the uv coordinates.

As expected, we defined our uv values for each vertex (two floats) through va1 with setVertexBufferAt:

context3D.setVertexBufferAt( 1, _vertexBuffer, 2, Context3DVertexBufferFormat.FLOAT_2 );

Our vertex positions and uv values are defined in our VertexBuffer3D object:

// create a vertex buffer
// format is (x,y,u,v) = 3 vertices, 4 dwords per vertex
vertexbuffer.uploadFromVector ( Vector.<Number>([
-1,-1, 0,1,
0,1,   1,0,
1,-1,  1,1
]), 0, 3 ); // start at offset 0, count 3

Then we retrieve the uv values in our fragment shader and sample our texture:

fragmentShaderAssembler.assemble( Context3DProgramType.FRAGMENT,
"mov ft0, v0 \n"+
"tex ft1, ft0, fs1 <2d,clamp,linear> \n"+ // sample texture 1
"mov oc, ft1 \n"
);

To define our texture, we instantiate our BitmapData, create a Texture object, and upload the pixels to the GPU:

texture = context3D.createTexture( 256, 256, Context3DTextureFormat.BGRA, false );
var bitmap:Bitmap = new MolePeopleBitmap(); // a bitmap asset embedded in the SWF
texture.uploadFromBitmapData( bitmap.bitmapData );

And then, to make it accessible as fs1 (fragment sampler 1) in the shader, we set it at sampler index 1:

context3D.setTextureAt( 1, texture );

By using this modified shader program, we end up with this:

Textured Triangle

In later tutorials, I will cover new effects like per-fragment fog or heat signatures through texture lookup.

Of course, we have only covered here how shaders work with Molehill. To control your pixels, you need triangles in your scene, and vertices and indices defining them. For this, you will need other objects like VertexBuffer3D and IndexBuffer3D, attached to your Context3D object.

The following figure illustrates the overall interaction of objects:

Molehill Architecture

As you can see, the Molehill APIs are very low-level and expose features for advanced 3D developers who want to work with 3D at such a level. Of course, some developers will prefer working with higher-level frameworks, which expose ready-to-use APIs, and we have taken care of that too.

Building mountains out of Molehill

We know that many ActionScript 3 developers would rather work with a light, a camera or a plane than with vertex buffers and shader bytecode. So to make sure that everyone can enjoy the power of Molehill, we are actively working with existing 3D frameworks like Alternativa3D, Flare3D, Away3D, Minko, Sophie3D, Yogurt3D and more. Today, most of these frameworks are already Molehill-enabled and will be available for you when Molehill ships in an upcoming version of the Adobe Flash runtimes.

Most of the developers of these frameworks were at MAX this year to present sessions about how they leveraged Molehill in their respective frameworks. We expect developers to build their engines on top of Molehill, so that advanced 3D developers and non-3D developers alike will be able to benefit from it.

I hope you enjoyed this little deep dive into Molehill, stay tuned for more Molehill stuff soon ;)

Comments (56)

  1. Jerome wrote:

Yeah, that's all nice, but for everyday life, a Scale9 on bitmaps would be really great :)

    Thursday, January 6, 2011 at 11:35 pm #
  2. Si ++ wrote:

    Please release a beta. Please. Pretty please!


    Friday, January 7, 2011 at 12:02 am #
  3. Uh… yes, for everyday life?… No, I'll take a look tomorrow!

    Friday, January 7, 2011 at 1:03 am #
  4. Thibault Imbert wrote:


    That's true, and I haven't forgotten! ;)


    Friday, January 7, 2011 at 3:51 am #
  5. Jarrad wrote:

    a lot (if not all) of this information, including the previous video, is just rehashed.

    Friday, January 7, 2011 at 5:00 am #
  6. Thibault Imbert wrote:

    Hi Jarrad,

    You mean the Molehill FAQ video ?

    I wanted to have this article covering in more details the AS3 APIs and the shaders work which was not covered in the previous video.


    Friday, January 7, 2011 at 5:39 am #
  7. is there any good reference to learn low level 3D terminologies?

    Friday, January 7, 2011 at 7:10 am #
  8. MoleHill I need to climb a lot….

    Friday, January 7, 2011 at 8:45 am #
  9. Nice introduction! I’m wondering… when Molehill’s shader language (AGAL) is targetting mobile platforms (via OpenGL ES 2.0), how does it deal with precision of computations? That is, is there a way to express highp/mediump/lowp types in AGAL?

    Friday, January 7, 2011 at 9:34 am #
  10. Reading it again and again, I can understand some things… The vertex shader takes care of the 3D-to-2D conversion of points and of calculating tangents, and the pixel shader (fragment shader) takes care of coloring each pixel. Combining both, we can render a 3D object. I need to learn more about the basics…

    Friday, January 7, 2011 at 10:15 am #
  11. maru wrote:

    Ouch !! Scary :/
    But really interesting to boost 2D performances though…
    Hopefully, there will be third party libraries for 2D that will wrap those low-level API into something more friendly for AS3 dev.

    Friday, January 7, 2011 at 11:10 am #
  12. zproxy wrote:

    I wonder if Flash would allow WebGL shaders and vertex programs to run :)

    Friday, January 7, 2011 at 1:53 pm #
  13. Jarrad wrote:

    @Thibault: It's great to have this information in blog form, but a lot of it was covered in Sebastian's session and your previous blog posts/videos (the last one was hilarious btw :P )

    Although I now have to wait, I'm ready to get my hands dirty :) or at least learn something new :)

    Saturday, January 8, 2011 at 5:22 am #
  14. OJ wrote:

    Great! What about mouse-event-based interactions with this Context3D object — how do we do that? And how do we draw and texture a ball with this funny assembler dialect?)

    Sunday, January 9, 2011 at 4:37 am #
  15. Si ++ wrote:


    Start researching and learning 3D programming, or use one of the high-level APIs that will be available. Molehill simply provides low-level access to the GPU, it isn’t a 3D framework.


    Sunday, January 9, 2011 at 5:26 pm #
  16. Héctor wrote:

    Thanks for the post Thibault, very informative. A doubt I was having is now solved, and I’m seeing Molehill is more low level than what I thought.

    Keep up the great work!

    Monday, January 10, 2011 at 10:15 am #
  17. Cihan Özçelik wrote:

    What about the performance of getting a webcam stream (or any video stream) and using it as a texture for augmented reality? If we can't place any DisplayObject behind 3D content, we should at least have a textured plane behind everything.

    Monday, January 10, 2011 at 6:04 pm #
  18. Thibault Imbert wrote:

    Yep OJ, all this would need to be coded on top of Molehill through an AS3 framework.

    Cihan Özçelik,

    Yes, you can totally do that, I have an example doing this. You will also be able to rasterize your 3D scene to a BitmapData object with a drawToBitmapData helper API.


    Tuesday, January 11, 2011 at 3:42 am #
  19. Gary Paluk wrote:

    Hi Thibault, I am very keen to apply to get onto the Molehill beta program. I’m working on a sophisticated 3D engine of my own and was in talks with Swift3D who stated that they would put me forward as a recommendation since they are on the program. Please could you let me know if this is or isn’t a viable option? ^_^

    Tuesday, January 11, 2011 at 5:23 am #
  20. Y.Boy wrote:

    Good.Flash 3D game is coming…

    Tuesday, January 11, 2011 at 7:38 am #
  21. telemaque wrote:

    more stuff! more stuff please!!! We've been waiting years for this in Flash!!! Thanks in any case for this first insight :-)

    Tuesday, January 11, 2011 at 11:25 pm #
  22. sHTiF wrote:

    Just 2 questions.

    First, do we know when the API is running on SwiftShader and when it's running on the GPU, through some kind of property or callback, so we can modify the scene or let the user know accordingly?

    Second when the hell will the beta be out so i can kick some serious butt :D

    Thursday, January 13, 2011 at 12:17 am #
  23. Thibault Imbert wrote:

    Hi sHTiF,

    Yes, you will be able to detect that and adapt your content appropriately.

    Beta ? Soon! ;)


    Thursday, January 13, 2011 at 12:19 am #
  24. Thibault Imbert wrote:

    Hi Gary,

    Please send me an email offline to talk about this.



    Friday, January 14, 2011 at 9:19 am #
  25. Si ++ wrote:


    Standard 2D Flash content won’t automatically be pushed through Molehill AFAIK, you will need to use the Molehill API to render 3D/GPU accelerated 2D content.

    The only real difference between 2D/3D/Isometric rendering when using Molehill is the 4×4 projection matrix you use.


    Please correct me if I’m wrong, Thibault

    Sunday, January 16, 2011 at 8:08 am #
  26. bwhiting wrote:

    yo, just a quick question

    will molehill support only one uv/normal/colour per vertex? i.e. will there be a way to have multiple uvs without having to duplicate the vertices?


    Monday, January 17, 2011 at 2:14 pm #
  27. joeydee wrote:

    Hi Thibault, thanks for this great preview into Molehill, awaiting more of it :-)

    Three questions up to now:

    1) When uploading the projection matrix via
    context3D.setProgramConstantsFromMatrix(Context3DProgramType.VERTEX, 0, modelMatrix, true );
    is modelMatrix a FP10 Matrix3D object, a Vector.(16) or something different?

    2) Will there be an API to create a proper projection matrix (fov,ar,near,far), or do we have to calculate it on our own (no problem with that, just wanted to know)?

    3) what does the “true” mean when uploading the matrix?


    Monday, January 17, 2011 at 5:11 pm #
  28. Gary Paluk wrote:

    Hey, I sent you an email Thibault, thanks, awaiting your response :) In the meantime I have a relatively in-depth question. I notice that you are using Vector.( [...] ) for vertex and index values and using the BitmapData type for textures. Will Molehill allow multiple types of textures that facilitate various bit-depth data, contained perhaps in a ByteArray (for example, a stencil buffer needs less data than a texture)? If this is the case, can the same principle be applied to the index and vertex buffers, with them provided to Molehill as a ByteArray block? Maybe with the fast memory access as with Alchemy introduced in this iteration of the Flash Player?! This would provide a much faster mechanism for buffer management and streamline the 'data as memory block (aka buffers)' concept that OpenGL developers are used to.

    Wednesday, January 19, 2011 at 10:42 am #
  29. Thibault Imbert wrote:

    Hi Joeydee,

    1. Yes, a FP10 Matrix3D object.
    2. No, you will have to create this yourself, yes :)
    3. A boolean to specify if the matrix is transposed or not.

    Hope this helps!


    Wednesday, January 19, 2011 at 8:09 pm #
  30. joeydee wrote:

    Yes this helps, thanks a lot :-)

    Thursday, January 20, 2011 at 9:56 am #
  31. OK Thibault.

    I’m very excited with Molehill until now.

    You really are looking at the OpenGL programmable pipeline (more specifically OpenGL ES 2.0, it seems) and extracting its idea.

    GREAT!!!! This is amazing!!!

    I’m an Apple Developer too and I just work with OpenGL.

    Right, now I have a question…

    Why on earth is Adobe trying to do this by creating a new shading language (AGAL)????
    Why not use OpenGL SL??? It's OPEN!!!!

    Oh right, to work with DirectX on PCs Adobe needs to convert, or something like that, but GLSL is much, much more readable than this kind of AGAL!!!

    You know, what the hell is all this op, vc, ft, ff, fa, si sa su, mi mo mu…. for GOD!!!

    Come on Thibault! You know I'm right!
    It's much, much better to work with “gl_Position” “gl_FragColor” “gl_FrontFacing” “uniform” “varying”!!!

    And please… don't tell me about Pixel Bender!!! That is for beginners!! Real developers love shaders and a shading language!!!

    Please, Thibault, I'd appreciate it very much if you introduced a more readable shading language into Molehill.

    Sunday, January 23, 2011 at 3:44 am #
  32. Thibault Imbert wrote:

    Hi Gary,

    Yes, you will be able to supply textures through bytes too, as ByteArray in addition to BitmapData objects, but also other types of compressed texture formats, to save a lot of space on the GPU and also provide mipmaps.

    More on that soon ;)


    Monday, January 24, 2011 at 9:51 am #
  33. Gary Paluk wrote:

    Thanks Thibault, that’s perfect and I think what you’re hinting at with the mip-mapping is how I have implemented it :)

    Monday, January 24, 2011 at 12:48 pm #
  34. Cedric wrote:

    “The beauty of “Molehill” APIs is that you do not have to worry what is happening internally. Am I running on DirectX, OpenGL or SwiftShader?”

    Yes, but since software rendering runs 10x slower than hardware, it would be nice to have a read-only property telling us what we are running on, so we can adapt the content quality at runtime accordingly.

    Thursday, January 27, 2011 at 4:03 pm #
  35. Thibault Imbert wrote:

    Hi Aras,

    Yes, with Molehill we always do high.


    Yes, you will be able to detect that :)


    Thursday, February 17, 2011 at 4:00 am #
  36. joeydee wrote:

    Hi Thibault,
    in the texture AGAL code:
    tex ft1, ft0, fs1
    how is the bracketed information translated into bytecode? Since there is only opcode, destination, sourceA and sourceB with fixed byte lengths. Perhaps a different opcode for every possible combination? How do you handle this?

    Tuesday, February 22, 2011 at 10:22 am #
  37. joeydee wrote:

    p.s. I talk about the information “2d,clamp,linear”

    Tuesday, February 22, 2011 at 10:24 am #
  38. Pedram wrote:

    holy cow!

    Sunday, February 27, 2011 at 11:39 pm #
  39. Hey Thibault,

    Great article, thanks!

    When are we going to get our hands on Pixel Bender 3D? I’m trying to implement a distance fog shader but writing directly in assembly is kicking my ass (I’ve managed to do it per-polygon but not per-vertex yet).

    Monday, February 28, 2011 at 11:28 pm #
  40. kutu wrote:

    hi thibault,

    I spent many hours, but can't recreate your example.
    Please upload the source code of this example.


    Tuesday, March 1, 2011 at 10:21 am #
  41. alijaya wrote:

    hmmm… I think you have forgotten to write
    context3D.setVertexBufferAt( 0, _vertexBuffer, 0, Context3DVertexBufferFormat.FLOAT_2 );
    when you explain how to use the BitmapData.
    I think it must be FLOAT_2, but the article doesn't note it, so it seems it would still be FLOAT_3.

    am i right? :-?

    Saturday, March 5, 2011 at 5:51 pm #
  42. Zhen Ju wrote:

    I'm having some trouble running MaxRacer. My graphics card is a Radeon 5770 and supports DX9, DX10 and even DX11, but when I'm running MaxRacer, the CPU usage is very high (over 90%); it's obvious that the GPU is not working with MaxRacer. How can I solve the problem? (waiting for your advice ^_<)

    Thursday, March 10, 2011 at 5:05 am #
  43. Zhen Ju wrote:

    My video card is a Radeon 5770 and it supports DX9, but when I'm running MaxRacer, the CPU is very busy (over 90% used by the game); it's obvious that the video card is not being used. Is there some way I can solve this problem?

    Thursday, March 10, 2011 at 5:14 am #
  44. Zhen Ju wrote:

    Hey Thibault,
    I'm trying to test your code from this article. I've installed the FP11 required tools (fp11.xml, playerglobal.swc), all those needed for Flash Professional. But I encounter a strange problem! It reports that flash.display3D::Context3D cannot be found! But I've put the files into the right place (I'm getting the correct code hints and FP11 publish settings) — is there something more I should do?

    Friday, March 11, 2011 at 9:51 am #
  45. Zhen Ju wrote:

    Hey Thibault,
    I've solved the previous problem! It's because we haven't yet had a standalone Flash Player 11, so I should try this program in web browsers! I'm writing the code in my Flash CS5, and then using publish preview (F12) to test it~ well, it's a tough job for me to check out the documentation [I'm Chinese, not very good at English :(], but I'm gonna try hard!

    Friday, March 11, 2011 at 2:19 pm #
  46. Pleh wrote:

    Hi Thibault,

    Not sure if you know the answer to this or not but I thought I would ask anyway…

    Can you use molehill with the flash packager for iphone? Or are there any plans to enable this?


    Thursday, March 17, 2011 at 7:57 pm #
  47. Michiel Brinkers wrote:

    Will the AGAL compiler eventually be integrated into the flash player/playerglobals? Seems like functionality which would benefit from native implementation.

    Friday, March 25, 2011 at 11:48 am #
  48. BillO wrote:

    I heard that Unity was developing an export for Flash that utilizes this new technology. Unity has a browser plug-in, but being able to use Unity and the Flash player is a much better solution.

    Friday, March 25, 2011 at 3:31 pm #
  49. Michiel Brinkers wrote:

    Thibault, can you maybe explain why Context3D.drawToBitmapData is so slow? 33 ms for copying a 256*256 image buffer seems rather long.
    Is there some problem with copying from the video memory back to RAM? Because once it’s in RAM I would expect the copy to a BitmapData to be really fast.

    Friday, March 25, 2011 at 5:51 pm #
  50. C’est tres prometteur tout ca! Je m’attaque tout de suite a MoleHill!

    Sourigna (EPITA 2011)

    Saturday, April 16, 2011 at 10:05 am #
  51. Neil wrote:

    Without any framework wrapping this stuff it will be out of reach for most Flash Developers.

    Thursday, June 30, 2011 at 10:27 am #
  52. Gildas wrote:

    Have you considered accessibility at all? Is there any concept of providing text equivalent assignments of object models that can be passed to the flash player and the operating system’s accessibility APIs?

    Thursday, July 14, 2011 at 5:36 pm #
  53. Salut Thibault,
    I have a question – how do StageVideo & Stage3D work together? I’m looking to play a video with Stage Video and overlay 3D on top of it with an engine such as Away3D. If this is not possible, and Away3D needs to be software rendered, how can I use Flash 3D engines like this after they have been rewritten to use Stage3D? Merci!

    Saturday, July 23, 2011 at 5:41 pm #
  54. Thibault Imbert wrote:

    Hi Kevin,

    This is possible, you will notice that in FP11 public beta, we also introduced a new feature. Stage3D can now be transparent, meaning that you can have a StageVideo surface in the back and Stage3D on top completely HW accelerated.


    Saturday, July 23, 2011 at 7:43 pm #
  55. Thank you Thibault, and that is AWESOME!!!
    I tried getting it to work, but I don’t think I’m doing it correctly (because Away3D’s bg is still white).

    for (var i:uint = 0; i < stage.stage3Ds.length; i++) {
        this.stage.stage3Ds[i].transparent = true;
    }

    Saturday, July 23, 2011 at 9:12 pm #
  56. fourfire wrote:

    There is a very painful problem: when I lock my computer and then unlock it, Flash Player 11 throws an error:
    Error: Error #3694: The object was disposed by an earlier call of dispose() on it.
    Tracking it in the debugger, I found that the Context3D is null, so I created another one with the requestContext3D() method of the Stage3D class, but then I got another error: Error #3600: No valid program set.

    Friday, August 26, 2011 at 5:53 am #
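    [Editor's note] The two errors in the comment above are the typical symptom of device loss (screen lock, sleep, driver reset): the old Context3D is disposed, and the fresh one starts empty, so drawing before a program has been set on it raises Error #3600. A minimal recovery sketch, with the actual resource re-upload left as a placeholder:

    ```actionscript
    import flash.display.Stage3D;
    import flash.display3D.Context3D;
    import flash.events.Event;

    // CONTEXT3D_CREATE fires on the first context creation *and* every time
    // the context is recreated after device loss, so all GPU-side state must
    // be rebuilt inside this handler.
    stage.stage3Ds[0].addEventListener(Event.CONTEXT3D_CREATE, onContextCreated);
    stage.stage3Ds[0].requestContext3D();

    function onContextCreated(event:Event):void {
        var context:Context3D = Stage3D(event.target).context3D;
        context.configureBackBuffer(800, 600, 0, true);
        // Re-create and re-upload programs, vertex/index buffers and textures
        // here, then call context.setProgram(...) before any drawTriangles(),
        // otherwise Error #3600 ("No valid program set") will fire again.
    }
    ```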

Trackbacks/Pingbacks (37)

  1. FDT» Blog Archive » Helpful Links on Monday, January 10, 2011 at 6:24 pm

    [...] Which then leads to Thibault Imbert’s post “Digging More Into Molehill APIs“. [...]

  2. Script.it » Archive » [Lab] 3D study #01 on Wednesday, January 12, 2011 at 3:33 pm

    [...] I hope they love me as much as I love them! Helpful links to get more insight: – Iñigo Quílez – Thibault Imbert on MoleHill – Rob Bateman – Away3D – Papervision3D The [Lab] 3D study #01 by Script.it, unless otherwise [...]

  3. nulldesign // lars gerckens » Flash 3D – Molehill on Thursday, January 13, 2011 at 12:00 am

    [...] you can imagine I’m playing around a lot with the new Molehill API and it’s so stunning and I can’t wait for the final player to be released to the [...]

  4. Digging more into the Molehill APIs by T… « Selim Anaç on Friday, January 14, 2011 at 8:08 pm

    [...] Digging more into the Molehill APIs by Thibault Imbert http://www.bytearray.org/?p=2555 [...]

  5. Thibault Imbert is a Trendsetter on Thursday, January 20, 2011 at 1:39 am

    [...] I forced my sister-in-law to watch a really cool video showing off the capabilities of the upcoming Molehill APIs. Even though she’s not a Flash developer, she gazed at the screen in amazement and I was [...]

  6. Anonymous on Thursday, February 3, 2011 at 11:19 pm

    [...] [...]

  7. arnokohl on Sunday, February 20, 2011 at 6:03 pm

    [...] Imbert (Adobe) on the technical background of Molehill: http://www.bytearray.org/?p=2555 The same author with a list of example videos by various developers who, with a [...]

  8. [...] Digging more into the Molehill APIs by Thibault Imbert [...]

  9. Adobe Flash Player 11给力发布! | 斯樵工坊 on Monday, February 28, 2011 at 6:53 am

    [...] Original source: http://www.bytearray.org/?p=2555 [...]

  10. [...] Digging into some features of the Molehill APIs (translated version): http://bbs.9ria.com/thread-71980-1-3.html; original: http://www.bytearray.org/?p=2555 [...]

  11. [...] Digging into some features of the Molehill APIs (translated version): http://bbs.9ria.com/thread-71980-1-3.html; original: http://www.bytearray.org/?p=2555 [...]

  12. [...] as possible in the code so you can follow along. I should note that this source code is based on Thibault Imbert’s blog post and this video. [...]

  13. [...] here to learn [...]

  14. [...] Digging more into the Molehill APIs [...]

  15. nulldesign // lars gerckens » Pixel Bender 3D beta available on Thursday, March 3, 2011 at 11:06 am

    [...] talking about and you’ve heard the word pixel shader for the first time, take a look at Thibault’s intro to Molehill and Michael’s Simple 2D Molehill [...]

  16. [...] The original article is here: http://www.bytearray.org/?p=2555 [...]

  17. [...] If you want to better understand exactly what Molehill is, I suggest reading Thibault Imbert’s recent blog post, Digging more into the Molehill APIs [...]

  18. [...] Digging into some features of the Molehill APIs (translated version): http://bbs.9ria.com/thread-71980-1-3.html; original: http://www.bytearray.org/?p=2555 [...]

  19. [...] If you want to better understand exactly what Molehill is, I suggest reading Thibault Imbert’s recent blog post, Digging more into the Molehill APIs [...]

  20. Flash in the Can 2011 Writeup - JonnyReeves.co.uk on Thursday, March 10, 2011 at 7:33 pm

    [...] to the triangles which are drawn to the screen. As the Molehill APIs are very low level, your only tool for writing shaders is assembly (via AGAL, the Adobe Graphics Assembly Language), which is something most Flash devs won’t be [...]

  21. [...] signal processing. Using Pixel Bender 3D or AGAL (“Adobe Graphics Assembly Language”) you can write your own shaders to crunch bytes fast. Whether those bytes represent 3D vertices, 2D sprites or audio samples, Flash Player 11 and your [...]

  22. [...] running was among the most eminent problems I had… the featured article “Digging more into the Molehill APIs by Thibault Imbert” on the molehill page is pretty useless. It describes what the next version will be able to [...]

  23. [...] Digging into some features of the Molehill APIs (translated version): http://bbs.9ria.com/thread-71980-1-3.html; original: http://www.bytearray.org/?p=2555 [...]

  24. [...] English version: http://www.bytearray.org/?p=2555 [...]

  25. Hello Molehill !! :D :D « SyntaxScrewer's Blog on Monday, May 9, 2011 at 7:14 am

    [...] Understanding Molehill – Digging Deeper Into Molehill http://www.bytearray.org/?p=2555 Understanding Molehill – Alternativa 3D 8 Migration (Talks about a few underlying concepts of [...]

  26. [...] Stage3D Stage3D is the layer that sits between Flash and the graphics processor (GPU). Stage3D is not part of the display list (you cannot addChild a Stage3D). Stage3D exists alongside the Stage object we already know. Stage3D sits behind all of Flash’s display objects but in front of StageVideo. Stage3D illustration (source: ByteArray) [...]

  27. Ein Dreieck in Molehill erstellen | senäh on Monday, May 30, 2011 at 1:08 pm

    [...] whyever one would want to do that). In addition, when no GPU is available, Molehill falls back to a software renderer called SwiftShader, so a Stage3D instance should always be available. If a GPU is present, the [...]

  28. Molehill 3D / 2D and Flaemo – First attempt | Flaemo on Tuesday, June 7, 2011 at 12:57 am

    [...] of all, there are plenty of resources available on the web where you can learn the basics. bytearray.org would be a good starting point. There is also a well-written article by [...]

  29. [...] Digging more into the Molehill APIs by Thilbault Imbert [...]

  30. Introducing Starling - ByteArray.org on Wednesday, September 21, 2011 at 6:20 am

    [...] of you, who had a look at the Stage3D APIs understand that this can be complex sometimes. So instead of writing 60 lines of code to draw a [...]

  31. [...] vs. using shaders and bytecode assemblers (which is awesome or horrible depending): http://www.bytearray.org/?p=2555 [...]

  32. Graphics Engines in Firefox | junglecode.net on Thursday, September 29, 2011 at 1:23 am

    [...] shipping product. Sometimes, you have no choice but to tack it on top of (or in Flash’s case, behind) the existing [...]

  33. [...] Reading the articles below first will help with understanding. Digging more into the Molehill APIs MoleHill Getting [...]

  34. GPU accelerated rendering with Starling « aloft on Thursday, October 20, 2011 at 1:57 am

    [...] been a long time since I did anything with actual 3D modeling, so the logical place to start is 2D. Dreading the complexity of the Stage3D APIs, I will focus on the Starling framework. There is a nice example of what complexity Starling hides [...]

  35. ND2D best practices – howto - nulldesign // lars gerckens on Tuesday, January 31, 2012 at 12:58 am

    [...] behind all Flash content. If you want to get a little low-level knowledge, read Thibault’s article here. Using the GPU, the Flash player is able to render full-screen HD content at 60 Hz… Finally a [...]

  36. AS3 Stage3D: Away3D | put things down on Wednesday, March 7, 2012 at 9:40 am

    [...] Digging more into the Molehill APIs – ByteArray.org [...]

  37. [...] Original source: Digging more into the Molehill APIs [...]