We closed this forum on 18 June 2010. It has served us well since 2005, as the ALPHA forum did before it from 2002 to 2005. New discussions are ongoing at the new URL http://forum.processing.org. You'll need to sign up and get a new user account. We're sorry about that inconvenience, but we think it's better in the long run. The content on this forum will remain online.
Real-time Video Delay (Read 4864 times)
Real-time Video Delay
May 20th, 2006, 7:26pm
 
Hi guys,

I've been looking for hardware solutions that can add a 15-second delay to a live video signal, but can't find anything.  I'm working on a small installation à la Dan Graham.  Now it seems there might be a possibility using Processing.  Does anyone know how to do this, or have any experience with it?

Setup:

Webcam or video cam --> Laptop --> Video out --> UHF transmitter --> Portable television.

Thanks,
jph
Re: Real-time Video Delay
Reply #1 - Jan 28th, 2008, 1:54pm
 
I have exactly the same problem.

Can anyone help us do this?

many thx
Re: Real-time Video Delay
Reply #2 - Feb 1st, 2008, 11:40pm
 
Here you go.

Note: Depending on the size of the video, you may need to increase the max memory (File -> Preferences), since the entire "buffer" is kept in memory as an array of PImages.  It worked fine for me with a 15-second delay (450 frames @ 30 fps) and a frame size of 320x240 when I had the memory set to 256 MB.  The buffer only needs to accommodate the framerate of the *input*, so keep that in mind when you're initializing it or the timing will be off.
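As a rough estimate, each buffered frame is stored uncompressed at 4 bytes per pixel (assuming ARGB pixel data, which is what a PImage holds), so the memory needed by the buffer works out like this:

```java
public class BufferMemory {
    public static void main(String[] args) {
        int fps = 30, seconds = 15;        // input framerate and delay length
        int width = 320, height = 240;     // frame size
        int frames = fps * seconds;        // 450 frames in the buffer
        // 4 bytes per ARGB pixel, per frame
        long bytes = (long) frames * width * height * 4;
        System.out.println(frames + " frames -> ~" + bytes / (1024 * 1024) + " MB");
        // prints: 450 frames -> ~131 MB
    }
}
```

That's why 256 MB was comfortable for 320x240 at 15 seconds; doubling the frame dimensions would quadruple the figure.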

This sketch uses the "station.mov" movie that is included with the examples when you download Processing; you'll need it in the "data" folder to try this out.

You'll need to replace the Movie with a Capture object for it to work with a camera instead of a pre-recorded movie.  The VideoBuffer class doesn't care where the images come from, but they DO need to be the same size.

Also, the output will be black until the buffer is full (for obvious reasons).  You could easily modify the VideoBuffer class to show an "initial" image while the buffer is filling up, but I'll leave that to you since I have no idea about your specific implementation needs.
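One way to do that modification is to count how many frames have been written and return a placeholder until the buffer has filled once. This is just a sketch of the idea (the DelayBuffer class, the framesAdded counter, and the placeholder value are mine, not from the code below), using plain ints instead of PImages so the logic stands alone:

```java
// Minimal model of "show an initial image until the buffer is full":
// same ring-buffer indexing as VideoBuffer, but frames are ints.
public class DelayBuffer {
    int[] buffer;
    int inputFrame, outputFrame;
    int framesAdded = 0;     // how many frames have been written so far
    int placeholder = -1;    // stand-in for the "initial" image

    DelayBuffer(int frames) {
        buffer = new int[frames];
        inputFrame = frames - 1;
        outputFrame = 0;
    }

    void addFrame(int frame) {
        buffer[inputFrame] = frame;
        if (framesAdded < buffer.length) framesAdded++;
        // advance and wrap both indexes
        inputFrame = (inputFrame + 1) % buffer.length;
        outputFrame = (outputFrame + 1) % buffer.length;
    }

    int getFrame() {
        // Until the buffer has wrapped once, the slot at outputFrame
        // is still empty, so return the placeholder instead of black.
        if (framesAdded < buffer.length) return placeholder;
        return buffer[outputFrame];
    }
}
```

In the real sketch you'd return a PImage of your choosing instead of -1.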


Code:

import processing.video.*;

VideoBuffer vb;
Movie myMovie;

void setup()
{
  size(300, 300, P3D);
  myMovie = new Movie(this, "station.mov");
  vb = new VideoBuffer(30, 160, 120);
  myMovie.loop();
}

void movieEvent(Movie m)
{
  m.read();
  vb.addFrame( m );
}

void draw()
{
  image( vb.getFrame(), 100, 100 );
  image( myMovie, 0, 0 );
}


class VideoBuffer
{
  PImage[] buffer;

  int inputFrame = 0;
  int outputFrame = 0;
  int frameWidth = 0;
  int frameHeight = 0;

  /*
    parameters:

    frames - the number of frames in the buffer (fps * duration)
    width - the width of the video
    height - the height of the video
  */
  VideoBuffer( int frames, int width, int height )
  {
    buffer = new PImage[frames];
    for(int i = 0; i < frames; i++)
    {
      this.buffer[i] = new PImage(width, height);
    }
    this.inputFrame = frames - 1;
    this.outputFrame = 0;
    this.frameWidth = width;
    this.frameHeight = height;
  }

  // return the current "playback" frame.
  PImage getFrame()
  {
    return this.buffer[this.outputFrame];
  }

  // Add a new frame to the buffer.
  void addFrame( PImage frame )
  {
    // copy the new frame into the buffer.
    System.arraycopy(frame.pixels, 0, this.buffer[this.inputFrame].pixels, 0, this.frameWidth * this.frameHeight);

    // advance the input and output indexes
    this.inputFrame++;
    this.outputFrame++;

    // wrap the values..
    if(this.inputFrame >= this.buffer.length)
    {
      this.inputFrame = 0;
    }
    if(this.outputFrame >= this.buffer.length)
    {
      this.outputFrame = 0;
    }
  }
}
Re: Real-time Video Delay
Reply #3 - Mar 3rd, 2008, 2:53pm
 
Hey rrrufusss, thanks for your code, it works perfectly!

I tried to adapt it for a webcam but didn't manage. I added a Capture object:

import processing.video.*;
Capture maCam;
VideoBuffer monBuff;

void setup() {
 size(320,240);
 maCam = new Capture(this, width, height, 30);
 monBuff = new VideoBuffer(30, 160, 120);
}

void captureEvent(Capture maCam) {
 maCam.read();
 monBuff.addFrame( maCam );
}

void draw() {
 image( monBuff.getFrame(), 100, 100 );
 image( maCam, 0, 0 );
}


Here is the error I get:

quicktime.std.StdQTException[QTJava:6.1.6g],-9405=couldntGetRequiredComponent,QT.vers:7168000
     at quicktime.std.StdQTException.checkError(StdQTException.java:38)


Thanks for your help!
Re: Real-time Video Delay
Reply #4 - Mar 4th, 2008, 1:54am
 
That exception looks like something is wrong with the camera.  Do the example sketches work with your web cam?

Also, the Capture needs to use the same dimensions as the VideoBuffer.  You have the camera set to use 320x240, but the buffer is using 160x120.  That might also be causing trouble (but I'd expect a different error if that were the case).

So, try the other camera example sketches to make sure your web cam works, and adjust the dimensions so the VideoBuffer matches the Capture object.  Hopefully that will get things working for you.
Re: Real-time Video Delay
Reply #5 - Sep 15th, 2009, 5:06am
 
Here follows code to display a delayed image from captured video. You can change the capture size and the displayed size independently.

Original code by rrrufusss, modified by Henrik G. Sundt.


Code:


import processing.video.*;
Capture maCam;
VideoBuffer monBuff;

int display_xsize = 1440;  // display size
int display_ysize = 900;
int capture_xsize = 320;  // capture size
int capture_ysize = 240;

int delay_time = 30;  // delay in seconds
int capture_frames = 30;  // capture frames per second


void setup() {
  size(display_xsize, display_ysize, P3D);
  // Warning: VideoBuffer must be initialized BEFORE capture or movie events start
  monBuff = new VideoBuffer(delay_time*capture_frames, capture_xsize, capture_ysize);
  maCam = new Capture(this, capture_xsize, capture_ysize, capture_frames);
}

void captureEvent(Capture maCam) {
  maCam.read();
  monBuff.addFrame( maCam );
}

void draw() {
  PImage bufimg = monBuff.getFrame();
  PImage tmpimg = createImage(bufimg.width, bufimg.height, RGB);
  tmpimg.copy(bufimg, 0, 0, bufimg.width, bufimg.height, 0, 0, bufimg.width, bufimg.height);
  tmpimg.resize(display_xsize, display_ysize);
  image( tmpimg, 0, 0 );
}


class VideoBuffer
{
PImage[] buffer;

int inputFrame = 0;
int outputFrame = 0;
int frameWidth = 0;
int frameHeight = 0;

/*
  parameters:

  frames - the number of frames in the buffer (fps * duration)
  width - the width of the video
  height - the height of the video
*/
VideoBuffer( int frames, int width, int height )
{
  buffer = new PImage[frames];
  for(int i = 0; i < frames; i++)
  {
    this.buffer[i] = new PImage(width, height);
  }
  this.inputFrame = frames - 1;
  this.outputFrame = 0;
  this.frameWidth = width;
  this.frameHeight = height;
}

// return the current "playback" frame.  
PImage getFrame()
{
  int frr;
 
  if(this.outputFrame>=this.buffer.length)
    frr = 0;
  else
    frr = this.outputFrame;
  return this.buffer[frr];
}

// Add a new frame to the buffer.
void addFrame( PImage frame )
{
  // copy the new frame into the buffer.
  System.arraycopy(frame.pixels, 0, this.buffer[this.inputFrame].pixels, 0, this.frameWidth * this.frameHeight);
 
  // advance the input and output indexes
  this.inputFrame++;
  this.outputFrame++;

  // wrap the values..    
  if(this.inputFrame >= this.buffer.length)
  {
    this.inputFrame = 0;
  }
  if(this.outputFrame >= this.buffer.length)
  {
    this.outputFrame = 0;
  }
}  
}


Re: Real-time Video Delay
Reply #6 - Sep 17th, 2009, 8:13am
 
Just a comment about the boundary check in getFrame(), which was added to the original code.

Code:

PImage getFrame()
{
  ...
  if(this.outputFrame>=this.buffer.length)
    frr = 0;
  ...
}


It might seem unnecessary since the boundaries are checked in addFrame():

Code:

void addFrame( PImage frame )
{
  ...

  this.outputFrame++;

  ...

  if(this.outputFrame >= this.buffer.length)
  {
    this.outputFrame = 0;
  }

  ...
}



But it is necessary, since events like captureEvent() or draw() can fire at any time, including between the increment and the check. Without the guard in getFrame(), a read landing in that window would index one past the end of the array.
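A compact way to express the same guard (my suggestion, not from the code above) is to take the index modulo the buffer length at read time, which maps the one out-of-range value back to 0 just like the if-check does:

```java
// Sketch: clamp the read index with a modulo so a read can never
// index out of bounds, even if outputFrame was just incremented
// by addFrame() and not yet wrapped.
public class SafeIndex {
    public static int safeFrameIndex(int outputFrame, int bufferLength) {
        return outputFrame % bufferLength;  // always in [0, bufferLength)
    }

    public static void main(String[] args) {
        // outputFrame == bufferLength is exactly the in-between state
        // the boundary check in getFrame() guards against.
        System.out.println(safeFrameIndex(450, 450));  // prints 0
        System.out.println(safeFrameIndex(449, 450));  // prints 449
    }
}
```

Note that neither version makes the class truly thread-safe; it only prevents the out-of-bounds read. If frames are dropped or duplicated occasionally, synchronizing addFrame() and getFrame() would be the next step.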