4. Combining Video and Graphics
- To obtain a first-person experience of real car racing, a video camera has been installed on the Mindstorms robot car
- Real-time video is sent through a wireless channel to the video server
- A software program should be developed to retrieve the video from the server and display it in Ogre
- Hence video and graphics are combined to achieve augmented reality
Department of ELECTRONIC AND INFORMATION ENGINEERING
4. Combining Video and Graphics by Dr Daniel Lun
References:
1. krssagar, "Simultaneous Previewing & Video Capture using DirectShow", http://www.codeproject.com/KB/audio-video/DXCapture.aspx
2. Ogre Tutorials, http://www.ogre3d.org/wiki/index.php/Ogre_Tutorials
The Video Server
- To minimize the delay in controlling the car, video information is sent through a 2.4GHz analogue channel from the car to the server
- Properties of analogue video
  - Advantage: minimum delay
  - Disadvantage: small capacity, i.e. there cannot be too many video channels
- Each car uses a different frequency (2.4GHz + ) to send the video information to the server
- Due to the limitation of the available frequencies, at most 4 cars can send video information at the same time
- Video information is received by a receiver, which is connected to the server via a USB port
- Users need to install their program on the server to read the video information through the USB port and display it on the screen
The Video Server (cont)

[Diagram: robot cars send video over a 2.4GHz analogue channel to the receivers, which connect to the video server via USB]

- Precaution: since analogue video is used, the received video quality can be affected by many environmental factors, such as
  - Distance between the car and the receiver
  - Electrical appliances that generate a similar frequency
The Webcam Class

[Diagram: receivers → USB → video server, with the Webcam class (Webcam.lib) reading the video data from the USB port]
- A class called Webcam has been pre-built to allow the user's program to retrieve video data
- The user's program only needs to add the library Webcam.lib to the project and include the header file Webcam.h
The Webcam Class (cont)
- The Webcam class provides functions for previewing and grabbing simultaneously from a webcam using DirectShow technology
- DirectShow is a middleware architecture that provides a pipeline for media playback and capture
- The DirectShow API enables playback of multimedia content from local storage or from streamed sources over a network or the Internet
- To use DirectShow, the Microsoft Windows SDK has to be installed
The Webcam Class (cont)

class Webcam
{
public:
    Webcam(void);
    ~Webcam(void);
    HRESULT Init(int iDeviceID, HWND hWnd, int iWidth, int iHeight);
    DWORD GetFrame(BYTE **pFrame);
    DWORD GrabFrame();
    :
};
Webcam::Init()
- Hooks up the user's program with, in this project, the USB port for receiving video data
- Performs the initialization of the video capture function
- Input parameters
  - iDeviceID is the selected device index. In this project, the number should range from 0 to 3
  - hWnd is the handle of the display window. In this project, since the video will not be displayed in a Microsoft Windows window, we can pass NULL here
  - iWidth and iHeight are the size of the video frame. In this project, we set them to 320 and 240, respectively
- Returns S_FALSE if everything is fine
Webcam::GrabFrame()
- GrabFrame() grabs a frame of image from the camera and stores it in an internal buffer
- Input parameter: Nil
- Return value: the size of the buffer
Webcam::GetFrame()
- Obtains the address of the image data buffer
- Input parameter
  - pFrame is the address of a memory location for storing the address of the image data buffer
- Returns the size of the buffer, which should be the same as that returned by GrabFrame()

BYTE *pImage;   // Define pImage as a BYTE pointer,
                // but do not initialize its value
DWORD bufferSize = GetFrame(&pImage);
// After calling the function, pImage will be
// initialized with the address of the image
// data buffer
Software Architecture of the User's Program

Ogre main program
1. Create Webcam object
2. Create a plane to cover up the whole screen
3. Define the material of that plane
4. Create the texture object to be used on that plane

processCalculation()
{
    :
    1. Grab 1 video frame to buffer
    2. Get the pointer of the buffer
    3. Put the data in the buffer on the texture object of the plane
    :
}

Update screen
Finish update screen
:
Create a Plane
- Plane is a primitive in Ogre
- We can change the size, position, orientation and texture of a plane
- In our previous labs, the Ferris wheel was built on a plane laid horizontally with a grass texture
- In this lab, we are going to create a plane that covers the whole screen
- We shall set the texture of the plane so that we can put the data in the image buffer on it
Ogre::Plane()
- The Plane class of Ogre defines a plane in 3D space
- There are a few constructors for the Plane class. The most common one is as follows:

Plane plane(Vector3::UNIT_Z, 0);
// Define the normal of the plane to be the z-axis
// and the distance from the origin to be 0

[Diagram: the plane passes through the origin (0,0,0) and is perpendicular to the z-axis, i.e. it lies in the x-y plane]
createPlane()
- We need to register the plane so that we can use it in our project
- The createPlane() member function of MeshManager takes in a Plane definition and makes a mesh from the parameters
- This registers our plane for use, e.g.

MeshManager::getSingleton().createPlane(
    "webcamPlane",
    ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME,
    plane, 320, 240, 20, 20, true, 1, 1, 1, Vector3::UNIT_Y);
// A plane called webcamPlane is registered.
// The size is 320 x 240. The plane stands
// parallel to the y-axis
Show the Plane on the Screen
- Similar to other mesh objects, we show the plane on the screen by first creating an entity of the plane and then calling attachObject() and setPosition(), e.g.

Entity *ent = mSceneMgr->createEntity("PlaneEntity", "webcamPlane");
SceneNode *node = mSceneMgr->getRootSceneNode()->createChildSceneNode();
node->attachObject(ent);
// node->setPosition(x??, y??, z??);
// Choose an appropriate value for x, y, z such that
// the plane will cover the whole screen
Set the Material
- When we manually created the plane, we did not specify what texture to use on it
- In fact, we want to put the image data received from the camera on the plane
- We first define the name of the material used for that plane as follows:

Entity *ent = mSceneMgr->createEntity("PlaneEntity", "webcamPlane");
ent->setMaterialName("Webcam/MyMaterial");

- We shall define the texture of this material later
Set the Texture of the Material
- We create the required texture by instantiating an object of the TextureSystem class

mTextureSystem = new TextureSystem(320, 240);
// Create the required texture with size 320x240

Ogre::MaterialPtr mat = Ogre::MaterialManager::getSingleton()
    .getByName("Webcam/MyMaterial");
// Get the pointer of the material named "Webcam/MyMaterial"
// - the same one used by the plane

Ogre::TextureUnitState *tex = mat->getTechnique(0)
    ->getPass(0)->getTextureUnitState(0);
// Get the pointer of the texture unit of that material

tex->setTextureName(mTextureSystem->GetTexture()->getName());
// Set the name of that texture unit to be the same as the
// texture created by TextureSystem, i.e. the plane that uses
// the material "Webcam/MyMaterial" will have the same texture
// as that created by TextureSystem
TextureSystem Class
- A class called TextureSystem is introduced in this lab to handle the creation and update of the texture of the material Webcam/MyMaterial
- Four major public member functions:
  - TextureSystem() - the constructor creates the texture
  - GetTexture() - get the pointer of the created texture
  - CleanTextureContents() - reset all pixels of the texture to 0 (paint it all black)
  - UpdateTexture() - copy a frame of video data to the texture
TextureSystem Class (cont)

class TextureSystem
{
public:
    TextureSystem(int width, int height);
    ~TextureSystem(void);
    Ogre::TexturePtr GetTexture();      /// Obtain the pointer of the Ogre texture
    void UpdateTexture(BYTE *pBmpTmp);  /// Copy a frame of video data to the texture
    void CleanTextureContents();        /// Clean the full texture (paint it all black)
protected:
    Ogre::TexturePtr mTexture;          /// Texture for rendering the video data
    Ogre::Real mTexWidth;               /// Real texture width
    Ogre::Real mTexHeight;              /// Real texture height
};
TextureSystem::TextureSystem()

TextureSystem::TextureSystem(int width, int height)
{
    mTexWidth = width;
    mTexHeight = height;

    // Create the texture we are going to use
    mTexture = Ogre::TextureManager::getSingleton().createManual(
        "WebcamManualTexture",                  // name
        Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME,
        Ogre::TEX_TYPE_2D,                      // texture type
        mTexWidth, mTexHeight,
        0,                                      // number of mipmaps
        Ogre::PF_BYTE_BGRA,                     // pixel format
        Ogre::TU_DYNAMIC_WRITE_ONLY_DISCARDABLE);
}
TextureSystem::GetTexture()
- Just returns the pointer of the created texture, i.e. the mTexture member variable

Ogre::TexturePtr TextureSystem::GetTexture()
{
    return mTexture;
}
TextureSystem::CleanTextureContents()

void TextureSystem::CleanTextureContents()
{
    unsigned int idx;
    int x, y;
    // lock the pixel buffer and get a pixel box
    :
    for (x = 0, y = 0; y < texh; )
    {
        idx = (x * 4) + y * texw * 4;
        pDest[idx]     = 0;     // blue
        pDest[idx + 1] = 0;     // green
        pDest[idx + 2] = 0;     // red
        pDest[idx + 3] = 255;   // alpha (255 -> opaque)
        x++;
        if (x >= texw)
        {
            x = 0;
            y++;
        }
    }
    // Unlock the pixel buffer
    :
}
TextureSystem::CleanTextureContents() (cont)

[Diagram: layout of the pixel buffer pointed to by pDest. Each pixel occupies 4 consecutive bytes in the order b, g, r, a. Pixel 0 through Pixel 319 form the first row; Pixel 320 starts the second row; Pixel 239*320 begins the last row]
TextureSystem::UpdateTexture()

void TextureSystem::UpdateTexture(BYTE *pBmpTmp)
{
    unsigned int idx;
    int x, y;
    // lock the pixel buffer and get a pixel box
    :
    // Input parameter pBmpTmp is a BYTE pointer which
    // gives the address of the buffer where the frame
    // of video data is stored

    // Get the data BYTE by BYTE from *pBmpTmp and put
    // them into the pixel buffer of the Texture
    :
    :
    // Unlock the pixel buffer
    :
}
Tasks to be Achieved
In this lab, you are asked to
- Complete the routine to create the plane and set the material and texture
- Complete the implementation of the member function UpdateTexture()
- Show the video in the Ogre environment
- Drive the car based on the live video received from the wireless camera installed on the car