CS 362: Graphics Scribe
Hidden Surface Removal
By:
Rajkumar Singh (09010138)
Akhilesh Shah (09010147)
Course Instructors:
Professor Pinaki Mitra
Professor Samit Bhattacharya
Department of Computer Science and Engineering
Indian Institute of Technology, Guwahati
23rd April, 2012
Abstract
In computer graphics, hidden surface determination, also known as visible surface
determination or hidden surface removal, is the process used to determine which surfaces
of a particular object are not visible from a particular angle or viewpoint.
In this scribe we describe the object-space method and the image-space method. We
also discuss algorithms based on the Z-buffer method, the A-buffer method, and the
scan-line method.
Keywords: A-buffer, Z-buffer, depth buffer, refresh buffer, Hidden Surface
Removal (HSR).
1 Hidden Surface Removal
Visible surface detection or hidden surface removal algorithms are mainly classified
according to whether they deal with object definitions directly or with the images
obtained after projection. Broadly, there are two types of methods, described below.
1.1 Object-Space Method
In the object-space method we take the whole object under consideration. This method
compares objects and parts of objects to each other within the scene definition to
determine which surfaces, as a whole, are visible from a given viewpoint, and marks
those surfaces as visible.
1.2 Image-Space Method
In an image-space algorithm, visibility is decided point by point at each pixel position
on the projection plane. Here we consider the projected image of the object, not the
object itself as in the object-space method. Visibility or invisibility is decided by a
pixel-by-pixel (point-by-point) traversal.
Note: Most visible surface / hidden surface removal algorithms use image-space
methods, although object-space methods can also be used effectively to locate visible
surfaces or to remove hidden surfaces. For example, line-display algorithms use
object-space methods for displaying lines in a wire-frame view.
2 Hidden Surface Removal Algorithms
The algorithms are described below.
2.1 Back-Face Detection
A fast and simple object-space method for identifying the back faces of a polyhedron is
based on the inside-outside test (whether a point is inside or outside of a plane/surface).
Let the plane parameters be A, B, C, D; then a point (x, y, z) is inside the plane or on
the plane if
Ax + By + Cz + D ≤ 0
When an inside point is along the line of sight to the surface, the polygon must be a
back face (we are inside that face and cannot see its front from our viewing position).
We can simplify this test by considering a vector N normal to the polygon surface, with
components (A, B, C), and a vector V in the viewing direction from the camera (eye)
position, as shown in Figure 1. The polygon is a back face if the dot product of its
normal vector and the viewing vector is greater than or equal to zero:
V · N ≥ 0
If the object descriptions have been converted to projection coordinates and our viewing
direction is parallel to the viewing zv axis, then V = (0, 0, Vz) and
V · N = Vz C
so we only need to consider the sign of C, the z component of the normal vector N.
Figure 1: Polyhedron
In a right-handed viewing system with the viewing direction along the negative zv axis,
the polygon is a back face if C < 0. Also, the viewer cannot see any face whose normal
has z component C = 0, since the viewing direction grazes that polygon. Thus, in
general, we can label any face as a back face if its normal vector has a z-component
value C ≤ 0. Observations based on the z-component value:
C < 0: back face of the polyhedron
C = 0: grazes the polygon, not visible from the viewing position
C > 0: face visible from the viewing position
Similar methods can be used for left-handed viewing systems. In a left-handed viewing
system, back faces are identified by C ≥ 0 when the viewing direction is along the
positive zv axis.
By examining parameter C for the different planes defining an object, all the back
faces can be identified. For a single convex polyhedron this test identifies all the hidden
surfaces, because each surface of the polyhedron is either completely visible or completely
hidden. Likewise, if a scene contains only nonoverlapping convex polyhedra, then all
hidden surfaces are identified with the back-face method.
For a concave polyhedron, more tests need to be carried out to determine whether
there are additional faces that are totally or partially obscured by other faces.
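As a concrete illustration, the following minimal Python sketch applies the back-face test described above. The function name and the assumption that the viewing direction is along the negative zv axis (so V = (0, 0, −1) and the test reduces to C ≤ 0) are illustrative choices, not part of the original notes.

# Minimal sketch of the back-face test (right-handed viewing system,
# viewing direction along the negative z_v axis). Names are illustrative.

def is_back_face(normal, view_dir=(0.0, 0.0, -1.0)):
    """Return True if a polygon with plane normal N = (A, B, C) is a back
    (or grazing) face with respect to the view direction V, i.e. V . N >= 0."""
    a, b, c = normal
    vx, vy, vz = view_dir
    return (vx * a + vy * b + vz * c) >= 0.0

# With V = (0, 0, -1), V . N = -C, so the test reduces to C <= 0:
print(is_back_face((0.0, 0.0, 1.0)))   # C > 0 -> front face -> False
print(is_back_face((0.0, 0.0, -1.0)))  # C < 0 -> back face  -> True
print(is_back_face((1.0, 0.0, 0.0)))   # C = 0 -> grazing    -> True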
2.2 Depth-Buffer Method
This approach is an image-space method. To detect visible surfaces, it compares surface
depths at each pixel position on the projection plane. The procedure is also called the
Z-buffer method, since object depth is measured from the view plane along the z axis of
the viewing system. Each surface of the scene is processed separately, one point at a
time across the surface. The method is usually applied to scenes containing only polygon
surfaces, because depth values can be computed very quickly and the method is easy to
implement, but it can be applied to nonplanar surfaces as well. Below is a figure
comparing the actual scene with the z-buffer scene.
Figure 2: Comparison between Z-buffer and Actual Scene
With the object descriptions converted to projection coordinates, each (x, y, z) position
on a polygon surface corresponds to the orthographic projection point (x, y) on the view
plane. Therefore, for each pixel position (x, y) on the view plane, object depths can be
compared by comparing z values. Figure 3 illustrates this comparison: it shows surfaces
at varying distances along the orthographic projection line from position (x, y) in a view
plane taken as the xv yv plane. Surface S1 is the closest at this position, so its surface
intensity at (x, y) is saved.
We can implement the depth-buffer algorithm in normalized coordinates, so that z values
range from 0 at the back clipping plane to zmax at the front clipping plane. The value of
zmax can be set either to 1 (for a unit cube) or to the largest value that can be stored on
the system.
As implied by the name of this method, two buffer areas are required. A depth buffer
is used to store depth values for each (x, y) position as surfaces are processed, and the
refresh buffer stores the intensity values for each position. Initially, all positions in the
depth buffer are set to 0 (minimum depth), and the refresh buffer is initialized to the
background intensity. Each surface listed in the polygon tables is then processed, one
scan line at a time, calculating the depth (z value) at each (x, y) pixel. The calculated
depth is then compared with the value stored in the depth buffer; if it is greater than
the stored value, the new value is written to the depth buffer, and the surface intensity
at that position is calculated and stored at the same location (x, y) in the refresh buffer.
Figure 3: Polygons at different depth
In short, the algorithm discussed above can be summarized in the following steps.
1. Initialize the depth buffer and refresh buffer for all buffer positions (x, y) with minus
infinity and the background intensity, respectively:
depth(x, y) = −∞, refresh(x, y) = Ibackground
2. For each position on each polygon surface, compare depth values to the previously
stored values in the depth buffer to determine visibility.
• Calculate the depth z for each position (x, y) on the polygon.
• If z > depth(x, y), then set
depth(x, y) = z, refresh(x, y) = Isurface(x, y)
where Ibackground is the value of the background intensity and Isurface(x, y) is the
projected intensity value for the surface at pixel position (x, y).
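The two steps above can be written as a short routine. The sketch below is a minimal illustration only: the surface representation (an object that can report the pixels it covers, with a depth and an intensity at each) is an assumption made for the example, and the depth convention follows the summary above (larger z is closer, buffer initialized to minus infinity).

# Minimal sketch of the depth-buffer (Z-buffer) algorithm summarized above.
# The surface interface (surf.rasterize) is an illustrative assumption.

import math

def z_buffer(width, height, surfaces, background):
    depth = [[-math.inf] * width for _ in range(height)]           # step 1
    refresh = [[background] * width for _ in range(height)]
    for surf in surfaces:
        for (x, y, z, intensity) in surf.rasterize(width, height):  # step 2
            if z > depth[y][x]:            # new point is closer to the viewer
                depth[y][x] = z
                refresh[y][x] = intensity
    return refresh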
The equation of a plane is
Ax + By + Cz + D = 0    (1)
Depth values for a surface position (x, y) are calculated from the plane equation of each
surface:
z = (−Ax − By − D) / C    (2)
For any scan line, adjacent horizontal positions across the line differ by 1, and the y value
on an adjacent scan line differs by 1. If the depth of position (x, y) has been determined
to be z, then the depth z′ of the next position (x + 1, y) along the scan line is obtained as
z′ = (−A(x + 1) − By − D) / C    (3)
⇒ z′ = z − A/C    (4)
Figure 4: Depth at adjacent position
The ratio −A/C is constant for each surface, so succeeding depth values across a scan line
are obtained from preceding values with a single addition.
We first determine the y-coordinate extents of each polygon and process the surface from
the topmost scan line downward. Starting at a top vertex, we can recursively calculate the
x position down a left edge of the polygon as x′ = x − 1/m, where m is the slope of the edge,
as in the figure shown below.
Figure 5: scan line and a polygon
Depth values down a left edge of the polygon are obtained recursively. Since the edge
satisfies y = mx + b, we have x = (y − b)/m, and on the next scan line (y − 1) the edge
intersection is x′ = (y − 1 − b)/m = x − 1/m. Substituting (x′, y − 1) into the plane
equation gives
z′ = (−A(x − 1/m) − B(y − 1) − D) / C
⇒ z′ = z + (A/m + B) / C    (5)
For a vertical left edge, m → ∞, so we calculate z′ using
z′ = z + B / C    (6)
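Equations (2) and (4) translate directly into an incremental routine: the depth at the left end of a span is computed once from the plane equation, and each subsequent depth is obtained with a single addition. The sketch below is illustrative only and assumes C ≠ 0; the function name is not from the original notes.

# Incremental depth evaluation across one scan line of the plane
# Ax + By + Cz + D = 0, using equations (2) and (4). Names are illustrative.

def scan_line_depths(A, B, C, D, y, x_start, x_end):
    """Yield (x, z) for integer x positions along scan line y (C must be nonzero)."""
    z = (-A * x_start - B * y - D) / C   # depth at the first position, eq. (2)
    step = -A / C                        # constant per-pixel change, eq. (4)
    for x in range(x_start, x_end + 1):
        yield x, z
        z += step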
Note: This depth-buffer (Z-buffer) method works only for opaque surfaces. For transparent
surfaces we must keep information about more than one surface per pixel, which requires
modifying the Z-buffer method; this modified method is known as the A-buffer method.
2.3 A-Buffer Method
The A-buffer method is an anti-aliased, area-averaged accumulation-buffer method. It was
developed for the surface-rendering system called REYES ("Renders Everything You Ever
Saw"). A drawback of the depth-buffer method is that it deals only with opaque surfaces
and cannot accumulate intensity values for more than one surface, as required for displaying
transparent surfaces. In the A-buffer method, each position in the buffer can reference a
linked list of surfaces. Thus, multiple surface intensities can be considered at each pixel
position, and object edges can be anti-aliased. Each position in the A-buffer has two fields:
• Depth field - stores positive or negative real number.
• Intensity field - stores surface intensity information or a pointer value.
Figure 6: surface overlap
A positive depth field signifies that the number stored at that position is the depth of a
single surface overlapping the corresponding pixel area; the intensity field then stores the
RGB components of the surface colour at that point and the percent of pixel coverage. A
negative depth field signifies that multiple surfaces contribute to the intensity; the intensity
field then stores a pointer to a linked list of surface data.
Figure 7: Organization of A-buffer pixel position
Data for each surface in the linked list includes:
• RGB intensity component
• opacity parameter (percent of transparency)
• depth
• percent of area covered
• surface identifier
• other surface rendering parameters
• pointer to next surface
Scan lines are processed to determine surface overlaps for each pixel across the individual
scan lines. Surfaces are subdivided into polygon meshes and clipped against the pixel
boundaries. Opacity factors and the percent of surface overlap can be used to calculate the
intensity of a pixel as an average of the contributions from the overlapping surfaces.
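To make the organization of Figure 7 concrete, the following minimal Python sketch models an A-buffer pixel and its linked list of surface fragments. The field and function names are illustrative assumptions; the sign convention on the depth field follows the description above.

# Minimal sketch of the A-buffer pixel organization described above.
# Field names and the conversion of a single surface into a list node
# are illustrative assumptions.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Fragment:                 # one node of the surface linked list
    rgb: tuple                  # RGB intensity components
    opacity: float              # opacity parameter
    depth: float
    coverage: float             # percent of pixel area covered
    surface_id: int
    next: Optional["Fragment"] = None   # pointer to the next surface

@dataclass
class APixel:
    depth: float                # >= 0: depth of a single surface; < 0: multiple surfaces
    intensity: object           # RGB intensity, or the head of the fragment list

def add_fragment(pixel: APixel, frag: Fragment) -> None:
    """Add a surface fragment to a pixel, switching to a linked list if needed."""
    if pixel.depth >= 0:
        # Turn the single stored surface into the first list node
        # (opacity, coverage and id are placeholders in this sketch).
        head = Fragment(pixel.intensity, 1.0, pixel.depth, 1.0, -1)
        pixel.depth, pixel.intensity = -1.0, head
    frag.next = pixel.intensity
    pixel.intensity = frag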
2.4 Scan Line Method
This is an image-space method for removing hidden surfaces and is an extension of the
scan-line algorithm for filling polygon interiors; here, instead of a single surface, we deal
with multiple surfaces. To determine the visible surfaces, all polygon surfaces intersecting
the scan line are examined. Across each scan line, the surface nearest to the view plane is
determined by making depth calculations. After that, the intensity value for that position
is entered into the refresh buffer. An edge table and a polygon table are set up for the
various surfaces.
Edge table contains:-
• The x-coordinate of the end with the smaller y-coordinate.
• The y-coordinate of the edge's other end.
• The x-increment.
• The polygon identification number indicating the polygon to which the edge belongs.
Figure 8: Edge Table Entry
Polygon table contains:-
• The coefficients of the plane equation.
• Shading or colour information for the polygon.
• An in-out boolean flag initialized to FALSE and used during the scan line processing.
Figure 9: Polygon Table Entry
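For reference, the edge-table and polygon-table records listed above can be written as small data structures. The sketch below is illustrative; the field names are assumptions made for the example and are not fixed by the notes.

# Minimal sketch of edge-table and polygon-table entries for the
# scan-line method, mirroring the fields listed above.

from dataclasses import dataclass

@dataclass
class EdgeEntry:
    x: float             # x-coordinate of the end with the smaller y-coordinate
    y_other: float       # y-coordinate of the edge's other end
    x_increment: float   # change in x from one scan line to the next (1/m)
    polygon_id: int      # polygon to which the edge belongs

@dataclass
class PolygonEntry:
    plane: tuple          # coefficients (A, B, C, D) of the plane equation
    colour: tuple         # shading or colour information
    in_out: bool = False  # in-out flag used during scan-line processing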
The figure below shows how we actually locate the visible portions of surfaces for pixel
positions along a scan line. The active list for scan line 1 contains information from the
edge table for edges AB, BD, EH and FG. For positions along this scan line between edges
AB and BD, only the flag for surface S2 is on. Therefore, no depth calculation is required,
and the intensity information for surface S2 is entered into the refresh buffer.
Figure 10: Scan Line Method for Hidden Surface Removal
For the given example we will show the Active edge table entries. The example is as follows.
Figure 11: Scan Line Method for Hidden Surface Removal
Scan Line   Active edge pairs and flags
a           AB–AC: in-out flag of ABC on between AB and AC
b           AB–AC: ABC on; DE–EF: DEF on
c           AB–DE: ABC on; DE–BC: ABC and DEF on; BC–EF: DEF on
c+1         AB–DE: ABC on; DE–BC: ABC and DEF on; BC–EF: DEF on
c+2         AB–BC: ABC on; DE–EF: DEF on
Table 1: Active Edge Table for the Scan Line Method in the above example
In the above example, between edges DE and BC the flags for both ABC and DEF are on,
so a depth calculation is needed there to decide which surface is visible. Since scan line
c + 1 intersects the same edges in the same order, coherence lets us reuse that result and
avoid repeating the depth calculation.
The scan-line method described above does not work for scenes in which there is cyclic
overlap of surfaces. One such case is shown below.
Figure 12: Cyclically Overlapping Surface
2.5 Depth-Sorting Method or Painter’s Algorithm
This method uses both image-space and object-space operations. This method for
hidden-surface removal is also known as the Painter's Algorithm because of the similarity
between creating a painting and the way this algorithm executes. In this algorithm we
perform the following basic tasks.
1. All the surfaces are sorted in order of decreasing depth. This sorting is carried
out in both image-space and object-space. For this we use the deepest point on each
surface for comparison.
2. Surfaces are scan converted in order, starting with the surface of greatest depth.
This scan conversion of the polygon surfaces is performed in image-space (the projected
space).
Referring to the figure shown below, we perform the following tests for each surface that
overlaps with S (shown in the figure).
Figure 13: Cyclically Overlapping Surface
If any one of the first three tests is true, no reordering is necessary for that surface. The
tests are listed in order of increasing difficulty.
Test 1: The bounding rectangles of the two surfaces in the xy-plane do not overlap.
Test 2: Surface S is completely behind the overlapping surface relative to the viewing
position.
In the figure shown above, this can be checked from the signs obtained by substituting the
endpoints of S into the plane equation of the overlapping surface.
Test 3: The overlapping surface is completely in front of S relative to the viewing position.
Test 4: The projections of the two surfaces on the view plane do not overlap.
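As a small illustration of the easiest of these tests, the Python sketch below checks whether the bounding rectangles of two projected surfaces overlap (tests 1 and 4). The surface representation, a list of projected (x, y) vertices, is an assumption made for the example.

# Minimal sketch of the bounding-rectangle overlap check used in
# tests 1 and 4 above. The surface representation is illustrative.

def bounding_rect(points):
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return min(xs), min(ys), max(xs), max(ys)

def rects_overlap(points1, points2):
    """True if the axis-aligned bounding rectangles of two projected
    surfaces overlap; if they do not, no reordering of the pair is needed."""
    ax0, ay0, ax1, ay1 = bounding_rect(points1)
    bx0, by0, bx1, by1 = bounding_rect(points2)
    return not (ax1 < bx0 or bx1 < ax0 or ay1 < by0 or by1 < ay0)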
3 Questions
Question 1 In what case is scan line algorithm not applicable?
The scan-line algorithm is not applicable in the case of intersecting surfaces or in cases of
cyclic overlap, as shown in the figure below.
Figure 14: Cyclically Overlapping Surface
Question 2 Give an example of the back-face detection algorithm.
Consider the example of a cube as shown in the figure below. n and V are the unit normal
vector to a surface and the unit vector in the viewing direction, respectively. In this example,
V = -k, nA = k, nC = -k
Figure 15: Example for Back Face Detection Algorithm
If the dot product of n and V is greater than or equal to zero, the surface is a back face:
(n · V) ≥ 0
In the given example, nA · V = −1, nC · V = 1, and nx · V = 0 for the surfaces
x = B, D, E and F. Hence, in the considered example, only surface A is visible.
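This answer can be checked with the illustrative is_back_face sketch from Section 2.1. The specific normals assigned below to faces B, D, E and F (all perpendicular to V) are assumptions made for the example; only nA and nC are given in the question.

# Checking the cube example with the illustrative is_back_face sketch:
# V = -k, n_A = +k, n_C = -k, remaining normals perpendicular to V.

normals = {
    "A": (0, 0, 1), "C": (0, 0, -1),
    "B": (1, 0, 0), "D": (-1, 0, 0), "E": (0, 1, 0), "F": (0, -1, 0),
}
for face, n in sorted(normals.items()):
    print(face, "back/grazing" if is_back_face(n) else "visible")
# Only face A is reported visible, matching the answer above.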
Question 3 Give an example of Z-buffer algorithm (show the calculated depth and color).
Consider the figure shown below. Surface A (red coloured) overlaps surface B (blue
coloured). Consider that surface A is at depth 3 and surface B at depth 5. Now, we prepare
a table containing the depth and intensity at each point of the region determined by polygon
EFGH.
Figure 16: Example for Z-buffer Algorithm
Figure 17: Table showing Depth and Intensity of each point
Question 4 Give an example of A buffer Algorithm.
Consider the figure shown below. Transparent surface A (red coloured) overlaps transparent
surface B (blue coloured) and surface C (green coloured). Consider that surface A is at
depth 3, surface B at depth 5 and surface C at depth 7. Now, we determine the intensity
at various points of the considered polygon EFGH.
Figure 18: Example for A-buffer Algorithm
For point A, a positive real number is stored in the depth field to signify a single surface
overlap, and the intensity is stored in the intensity field. However, for point G, which has
multiple surface overlaps, a negative real number is stored in the depth field. The intensity
field then points to the first node of a linked list that stores the depth value and intensity
of surface A; its next pointer leads to surface B, and so forth.
Figure 19: Example for A-buffer Algorithm
Question 5 Explain Depth sorting method.
• Uses both image-space and object-space operations.
• Performs the following basic functions:
• Surfaces are sorted in order of decreasing depth.
• Surfaces are scan converted in order, starting with the surface of greatest depth.
• Often referred to as the painter's algorithm.
• Tests are listed in order of increasing difficulty, for example:
• The surfaces do not overlap.
• The projections of the two surfaces onto the view plane do not overlap.