From Gnash Project Wiki


This page describes the current effort underway to stabilize Audio/Video streaming in Gnash.

Caution: this proposal is written based on certain assumptions which will have to be verified.

1. Remove ad hoc ffmpeg support

Currently, Gnash implements audio/video decoding through either gstreamer or ffmpeg. Under the hood, the gstreamer path uses ffmpeg for audio and video decoding anyway. Gstreamer adds the advantage of a simple, well-defined and well-documented abstraction mechanism. Using ffmpeg directly adds no features that I know of.

It is therefore proposed that ffmpeg support be removed.

2. Implement URL-based A/V playback using gstreamer playbin

Playbin is a gstreamer module which can download and demux audio and video from a URL, starting a background thread to do the downloading. A "sink" can be connected to the playbin module so that we can feed video frames to the renderer. Playbin has the major advantage that it abstracts away many of the threading issues: we would no longer need a "LoadThread" for downloading, and I believe the need for a decoding thread would also be removed.
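
As a rough illustration of the intended usage (this is a sketch, not the actual Gnash code: it assumes GStreamer's 0.10-era C API, the URL is a placeholder, and fakesink stands in for a renderer-backed sink that Gnash would have to provide):

```c
#include <gst/gst.h>

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    /* playbin downloads, demuxes and decodes the URI in its own threads */
    GstElement *play  = gst_element_factory_make("playbin", "play");
    /* fakesink is a placeholder; Gnash would supply a sink that hands
     * decoded frames to the renderer */
    GstElement *vsink = gst_element_factory_make("fakesink", "video-out");

    g_object_set(G_OBJECT(play),
                 "uri", "http://example.com/video.flv",   /* placeholder */
                 "video-sink", vsink,
                 NULL);

    gst_element_set_state(play, GST_STATE_PLAYING);

    /* A real integration would watch the pipeline bus for EOS and
     * errors instead of simply blocking in a main loop. */
    GMainLoop *loop = g_main_loop_new(NULL, FALSE);
    g_main_loop_run(loop);
    return 0;
}
```

Because playbin owns the download and decode threads, Gnash's role shrinks to configuring the pipeline and consuming frames from the sink.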

An additional advantage is that playbin automatically detects the format of the URL provided. It will parse and play not only Ogg Theora streams but also FLV video, removing the need for the FLV decoder currently implemented in Gnash itself.

  • It is important to note that there is no RTMP support in either gstreamer or ffmpeg. However, the current RTMP implementation is reportedly not functional, so we may consider using playbin until RTMP support matures, and switch to the more fine-grained decodebin when that happens.
  • Another important point is that gstreamer doesn't support extracting 0x12 (META) tags (AMF data) from an FLV, so it can't be used for everything. However, in its current incarnation, Gnash's NetStream does not use META tags, in the sense that we have not provided a hook for the onMetaData ActionScript event.
  • Another consideration is that we want to control all network access in a central place (for security, caching, and so on).
  • gst_ffmpeg, which is used for FLV demuxing and decoding, is unable to demux media loaded over HTTP (and other streaming sources) because it doesn't implement gstreamer's "push" mode. However, gstreamer CVS contains a fully functional FLV demuxer which does handle streaming.

3. Implement SWF-embedded video using gstreamer decodebin

Since video embedded directly in SWF can't be played using playbin, we'll use decodebin. Decodebin is a lower-level gstreamer module which allows us to feed it data through a buffer.
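
To sketch the idea (again assuming GStreamer's 0.10-era API; the callbacks are illustrative stubs, and the commented-out lines mark where Gnash-specific code would go): a fakesrc with signal-handoffs enabled can hand SWF-extracted video data to decodebin, which announces decoded streams via its new-decoded-pad signal.

```c
#include <gst/gst.h>

/* Called each time fakesrc wants a buffer; here we would copy the next
 * chunk of video data extracted from the SWF into 'buf'. */
static void
on_handoff(GstElement *src, GstBuffer *buf, GstPad *pad, gpointer data)
{
    /* memcpy(GST_BUFFER_DATA(buf), swf_video_chunk, chunk_size); */
}

/* Called when decodebin has detected the format and exposes a pad
 * carrying decoded frames; we would link it to our video sink here. */
static void
on_new_decoded_pad(GstElement *decoder, GstPad *pad, gboolean last,
                   gpointer data)
{
    /* gst_pad_link(pad, our_sink_pad); */
}

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    GstElement *pipeline = gst_pipeline_new("embedded-video");
    GstElement *src = gst_element_factory_make("fakesrc", "swf-src");
    GstElement *dec = gst_element_factory_make("decodebin", "decoder");

    g_object_set(G_OBJECT(src), "signal-handoffs", TRUE, NULL);
    g_signal_connect(src, "handoff", G_CALLBACK(on_handoff), NULL);
    g_signal_connect(dec, "new-decoded-pad",
                     G_CALLBACK(on_new_decoded_pad), NULL);

    gst_bin_add_many(GST_BIN(pipeline), src, dec, NULL);
    gst_element_link(src, dec);
    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    /* ... run a main loop; data flows in via the handoff callback ... */
    return 0;
}
```

The same format auto-detection that playbin performs happens here too, so embedded FLV and other codecs would be handled by whatever decoders the installed gstreamer plugins provide.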

Since I don't yet know exactly how SWF embeds video, this section will have to be updated later.