CHAPTER 1
1.1 Introduction
A simple particle system in computer graphics is a technique used to simulate fuzzy phenomena such
as fire, smoke, and explosions. In this mini project, we create a basic particle system to visualize particles
emitting from a source, moving according to physics-based rules. Each particle has properties like
position, velocity, lifespan, and color, which are updated over time to produce dynamic and realistic
effects. The project involves initializing particles, updating their states, and rendering them to create
visually appealing animations. This hands-on experience helps understand the fundamental concepts of
particle systems and their applications in graphics programming.
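To make these properties concrete, the following is a minimal C++ sketch (illustrative only, not the project's actual code) of how a particle with position, velocity, colour, and lifespan might be represented:

#include <vector>

// One particle of the system; the field names here are assumptions for illustration.
struct Particle {
    float x, y, z;       // position
    float vx, vy, vz;    // velocity
    float r, g, b;       // colour
    float life;          // remaining lifespan in seconds
};

// The emitter simply keeps a collection of live particles.
std::vector<Particle> particles;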
The term computer graphics has been used in a broad sense to describe "almost everything on
computers that is not text or sound." It is one of the most powerful and interesting areas of computing:
there is a great deal that can be done beyond drawing figures of various shapes.
Today, computers and computer-generated images touch many aspects of daily life. Computer imagery
appears on television, in newspapers (for example, in weather reports), and in all kinds of medical
investigations and surgical procedures. A well-constructed graph can present complex statistics in a form
that is easier to understand and interpret, and in the media such graphs are used to illustrate papers,
reports, and other presentation material.
Many powerful tools have been developed to visualize data. Computer-generated imagery can be
categorized into several different types: 2D, 3D, and animated graphics. As technology has improved,
3D computer graphics have become more common, but 2D computer graphics are still widely used.
Computer graphics has emerged as a sub-field of computer science that studies methods for digitally
synthesizing and manipulating visual content. Over the past decade, other specialized fields have
developed, such as information visualization and scientific visualization, which are more concerned with
the visualization of three-dimensional phenomena (architectural, meteorological, medical, biological, etc.),
where the emphasis is on realistic renderings of volumes, surfaces, and illumination sources, perhaps
with a dynamic (time) component.
Computer graphics has experienced significant advancements since its inception in the mid-20th
century.
• The concept of computer graphics emerged with the development of the first electronic
computers in the 1940s.
• The 1970s saw the development of raster graphics systems, enabling the representation
of images as grids of pixels.
• The release of the video game "Pong" by Atari in 1972 demonstrated the potential of
interactive computer graphics in entertainment.
• The 1980s marked the introduction of dedicated graphics hardware and algorithms for
rendering 3D objects, improving the realism of images.
• The 1982 movie "Tron" showcased the use of computer-generated imagery (CGI) in
film.
• The 1990s saw the development of powerful graphics software like Adobe Photoshop
and 3D Studio Max.
• The rise of powerful graphics processing units (GPUs) in the 2000s enabled real-time
rendering of complex 3D graphics, impacting video games, virtual reality (VR), and
augmented reality (AR).
Throughout its history, computer graphics has continually pushed the boundaries of visual content
creation, transforming industries such as entertainment, design, and science. The field continues to evolve,
driven by innovations in hardware, software, and algorithms.
Nowadays, computer graphics is used in almost every area, ranging from science, engineering, medicine,
business, industry, and government to art, entertainment, education, and training.
Computer-aided design methods are routinely used in the design of buildings, automobiles,
aircraft, watercraft, spacecraft, computers, textiles, and many other products.
Another major application area is presentation graphics, used to produce illustrations for reports
or to generate slides. Presentation graphics is commonly used to summarize financial, statistical,
mathematical, and scientific data for research reports and other documents, for example as 2D and
3D bar charts illustrating a mathematical or statistical result.
1.6 Objectives
A. Understand Particle Systems: Gain a comprehensive understanding of particle systems and their
applications in computer graphics.
B. Initialize Particles: Develop the ability to initialize particles with properties such as position,
velocity, lifespan, and color.
C. Implement Physics-Based Rules: Learn to apply physics-based rules to update particle states over
time for realistic behavior (a minimal update sketch follows this list).
D. Rendering Techniques: Explore techniques for rendering particles dynamically to create visually
appealing animations.
E. Simulation of Natural Phenomena: Simulate natural phenomena like fire, smoke, and explosions
using particle systems.
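As a sketch of objectives B and C, the routine below applies a simple physics-based rule (constant gravity with an explicit Euler step) and retires particles whose lifespan has expired. It reuses the hypothetical Particle structure sketched in Section 1.1 and is an assumption, not the project's implementation:

#include <algorithm>
#include <vector>

// Advance every particle by dt seconds and drop the ones whose lifespan has run out.
void updateParticles(std::vector<Particle>& ps, float dt) {
    const float g = -9.81f;                  // gravitational acceleration along y
    for (auto& p : ps) {
        p.vy += g * dt;                      // velocity responds to gravity
        p.x  += p.vx * dt;                   // position follows velocity (Euler step)
        p.y  += p.vy * dt;
        p.z  += p.vz * dt;
        p.life -= dt;                        // age the particle
    }
    ps.erase(std::remove_if(ps.begin(), ps.end(),
                            [](const Particle& p) { return p.life <= 0.0f; }),
             ps.end());
}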
CHAPTER 2
INTRODUCTION TO OPENCV
OpenCV's functions in Python are the fundamental building blocks of the library, providing low-level
operations for manipulating and processing images and matrices. These functions form the backbone of
OpenCV and are used by many of the higher-level functions and algorithms in the library. OpenCV
provides functions for reading and writing images and videos in various formats. OpenCV also provides
a set of basic image processing functions, such as image arithmetic, pixel manipulation, and colour space
conversion.
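As an illustration, a short C++ program using OpenCV's standard image I/O, image arithmetic, and colour-space conversion functions might look like this (the file names are placeholders):

#include <opencv2/opencv.hpp>

int main() {
    cv::Mat img = cv::imread("input.png");           // read an image from disk
    if (img.empty()) return 1;                       // stop if the file could not be loaded
    cv::Mat gray;
    cv::cvtColor(img, gray, cv::COLOR_BGR2GRAY);     // colour-space conversion
    cv::Mat brighter = gray + 50;                    // simple image arithmetic
    cv::imwrite("output.png", brighter);             // write the result back to disk
    return 0;
}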
While OpenCV handles image input and processing, the rendering in this project is done with OpenGL,
whose pipeline is summarized below. Primitives are defined by a group of one or more vertices. A vertex
defines a point, an endpoint of a line, or a corner of a polygon where two edges meet. Data (consisting of
vertex coordinates, colours, normals, texture coordinates, and edge flags) is associated with a vertex, and
each vertex and its associated data are processed independently, in order, and in the same way. The
only exception to this rule is if the group of vertices must be clipped so that a particular
primitive fits within a specified region; in this case, vertex data may be modified and new
vertices created. The type of clipping depends on which primitive the group of vertices
represents. You can control modes independently of each other; that is, setting one mode does
not affect whether other modes are set.
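As an illustration of a group of vertices forming a primitive, a single triangle with per-vertex colour data can be specified in immediate-mode OpenGL as follows (a sketch, not project code):

glBegin(GL_TRIANGLES);             // start a triangle primitive
glColor3f(1.0f, 0.0f, 0.0f);       // data associated with the next vertex
glVertex2f(-0.5f, -0.5f);
glColor3f(0.0f, 1.0f, 0.0f);
glVertex2f(0.5f, -0.5f);
glColor3f(0.0f, 0.0f, 1.0f);
glVertex2f(0.0f, 0.5f);
glEnd();                           // close the group of vertices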
Commands are always processed in the order in which they are received, although there may
be an indeterminate delay before a command takes effect.
The effects of OpenGL commands on the frame buffer are ultimately controlled by the window
system that allocates frame buffer resources. The window system determines which portions
of the frame buffer OpenGL may access at any given time and communicates to OpenGL how
those portions are structured. Therefore, there are no OpenGL commands to configure the
frame buffer or initialize OpenGL.
Rather than having all commands proceed immediately through the pipeline, you can choose to
accumulate some of them in a display list for processing later.
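A display list records a group of commands once so they can be replayed later; a minimal sketch:

GLuint box = glGenLists(1);        // reserve one display-list name
glNewList(box, GL_COMPILE);        // record the following commands instead of executing them
glBegin(GL_QUADS);
glVertex2f(0, 0); glVertex2f(1, 0); glVertex2f(1, 1); glVertex2f(0, 1);
glEnd();
glEndList();

glCallList(box);                   // later: replay the recorded commands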
The evaluator stage of processing provides an efficient means for approximating curve and
surface geometry by evaluating polynomial functions of input values. During the next stage,
per-vertex operations and primitive assembly, OpenGL processes geometric primitives—
points, line segments, and polygons, all of which are described by vertices. Vertices are
transformed and lit, and primitives are clipped to the viewport in preparation for the next stage.
Rasterization produces a series of frame buffer addresses and associated values using a two-
dimensional description of a point, line segment, or polygon. Each fragment so produced is fed
into the last stage, per-fragment operations, which performs the final operations on the data
before it is stored as pixels in the frame buffer. These operations include conditional updates to
the frame buffer based on incoming and previously stored z-values (for z-buffering) and
blending of incoming pixel colours with stored colours, as well as masking and other logical
operations on pixel values.
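In code, the depth test and blending described above are enabled with calls such as the following (a sketch):

glEnable(GL_DEPTH_TEST);                              // conditional updates based on stored z-values
glDepthFunc(GL_LESS);                                 // keep the fragment closest to the viewer
glEnable(GL_BLEND);                                   // blend incoming colours with stored colours
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);    // standard alpha blending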
Input data can be in the form of pixels rather than vertices. Such data, which might describe an
image for use in texture mapping, skips the first stage of processing described above and instead
is processed as pixels, in the pixel operations stage. The result of this stage is either stored as
texture memory, for use in the rasterization stage, or rasterized and the resulting fragments
merged into the frame buffer just as if they were generated from geometric data.
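Pixel data that takes this path is typically handed to texture memory with glTexImage2D; in the sketch below, width, height, and pixels are assumed to describe an RGBA image already in memory:

GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);      // copy the pixels into texture memory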
All elements of OpenGL state, including the contents of the texture memory and even of the
frame buffer, can be obtained by an OpenGL application.
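For example, the current modelview matrix and the contents of the frame buffer can be read back as follows (width and height are assumed to be the window dimensions):

GLfloat modelview[16];
glGetFloatv(GL_MODELVIEW_MATRIX, modelview);          // query a piece of OpenGL state

std::vector<unsigned char> pixels(width * height * 4);
glReadPixels(0, 0, width, height, GL_RGBA,
             GL_UNSIGNED_BYTE, pixels.data());        // read back the frame buffer contents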
OpenCV itself is organized into several modules:
1. Core Functionality: Handles image representation (cv::Mat), geometric entities (cv::Point,
cv::Rect, cv::Size), and basic operations.
2. Image Processing: Provides functions for image I/O, color conversion, filtering, and convolution.
3. Object Detection and Tracking: Includes tools for cascade classifiers (cv::CascadeClassifier) and
deep learning models (DNN module) for object detection.
4. Video Analysis: Offers functionality for video I/O, background subtraction, and frame processing.
5. High-Level GUI and Utilities: Provides drawing functions (cv::line, cv::rectangle, etc.) for
annotations, GUI utilities (cv::imshow, cv::waitKey), and optimization tools.
OpenCV supports multiple programming languages, including C++, Python, Java, and MATLAB, making
it widely accessible for cross-platform development and integration into various applications.
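A short C++ example combining the drawing and GUI utilities listed above (the window name and geometry are arbitrary):

#include <opencv2/opencv.hpp>

int main() {
    cv::Mat canvas(240, 320, CV_8UC3, cv::Scalar(0, 0, 0));       // black canvas
    cv::line(canvas, cv::Point(10, 10), cv::Point(310, 230),
             cv::Scalar(0, 255, 0), 2);                           // green annotation line
    cv::rectangle(canvas, cv::Point(50, 50), cv::Point(270, 190),
                  cv::Scalar(0, 0, 255), 2);                      // red rectangle
    cv::imshow("demo", canvas);                                   // high-level GUI window
    cv::waitKey(0);                                               // wait for a key press
    return 0;
}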
CHAPTER 3
SYSTEM DESIGN
3.2 ARCHITECTURE
3.3 FLOWCHART
This flowchart represents the control flow and functionality of a particle system program. It begins
with the Start block, followed by the Main(), Init(), and Display() functions. User interactions are handled
through various keyboard functions. Pressing 'f' toggles fog on or off, while 't' changes the spray type to
either waterfall or fountain. The 's' key adds collision spheres, with a maximum of three spheres. The '-'
key decreases the particle flow. Pressing 'p' reduces the size of the particles, whereas 'P' increases their
size. The 'l' key toggles between points and lines rendering modes. The '#' key toggles the frame rate
display on or off, and the '~' key toggles full-screen mode. Together, these controls make the particle
system simulation interactive and versatile.
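A keyboard callback matching this control scheme could be sketched as follows; the state variables (fogEnabled, sprayType, and so on) are illustrative assumptions, not the project's actual identifiers:

// Hypothetical GLUT keyboard handler for the controls described in the flowchart.
void keyboard(unsigned char key, int x, int y) {
    switch (key) {
        case 'f': fogEnabled = !fogEnabled;             break;  // toggle fog
        case 't': sprayType = !sprayType;               break;  // waterfall <-> fountain
        case 's': if (numSpheres < 3) ++numSpheres;     break;  // add a collision sphere (max 3)
        case '-': particleFlow -= 100;                  break;  // decrease particle flow
        case 'p': pointSize = std::max(1.0f, pointSize - 1.0f); break;  // smaller particles
        case 'P': pointSize += 1.0f;                    break;  // larger particles
        case 'l': drawAsLines = !drawAsLines;           break;  // points <-> lines rendering
        case '#': showFrameRate = !showFrameRate;       break;  // toggle frame-rate display
        case '~': glutFullScreen();                     break;  // enter full-screen mode
    }
    glutPostRedisplay();
}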
The project offers the following advantages:
A. Visual Effects Creation: Provides experience in creating dynamic visual effects like fire, smoke,
and water, which are widely used in media and entertainment.
B. Interactive Learning: Enhances understanding of computer graphics concepts through hands-on
interaction with particle systems.
C. User-Controlled Features: Allows users to manipulate various parameters, fostering
experimentation and deeper comprehension of particle behavior.
D. Programming Skills: Strengthens coding skills in graphics programming, particularly in
managing and optimizing particle systems.
E. Versatility in Applications: Offers insights into applications of particle systems in different
fields, including virtual reality, special effects, and scientific visualization.
F. Problem-Solving and Creativity: Encourages creative problem-solving by addressing
challenges in designing and implementing particle interactions and behaviors.
G. Performance Optimization: Teaches optimization techniques to manage and render a large
number of particles efficiently, enhancing performance skills.
3.1 DISADVANTAGES
A. Performance Constraints: High computational requirements can slow down the system,
especially with a large number of particles, affecting real-time performance.
B. Complexity in Implementation: Developing and managing a particle system can be complex
and time-consuming, requiring a strong grasp of graphics programming and physics.
C. Limited Realism: Simple particle systems might not achieve the high level of realism needed for
advanced applications without significant enhancements.
D. Resource Intensive: Intensive use of memory and processing power can limit the application on
lower-end hardware.
E. Debugging Challenges: Troubleshooting and debugging particle systems can be difficult due to
the dynamic and concurrent nature of particle updates and rendering.
F. Scalability Issues: As the number of particles increases, maintaining performance and visual
quality can become problematic.
G. Specialized Knowledge Required: Requires specialized knowledge in computer graphics,
physics, and optimization techniques, which might be a barrier for beginners.
CHAPTER 4
REQUIREMENTS SPECIFICATION
To implement this project in graphics using OpenGL, the system must meet certain hardware and
software requirements, which are listed below:
CHAPTER 5
PROJECT DESCRIPTION
The simple particle system involves developing an interactive particle system to simulate natural
phenomena like fire, smoke, and water. The system initializes particles with properties such as position,
velocity, lifespan, and color, and updates these properties in real-time based on physics-based rules. Users
can interact with the system through various keyboard controls, allowing them to toggle fog effects,
change spray types, add collision spheres, adjust particle size and flow, and switch between different
rendering modes. The project aims to provide hands-on experience in graphics programming, enhance
understanding of particle dynamics, and explore optimization techniques for real-time simulation. Despite
its complexity and resource-intensive nature, this project offers valuable insights into creating dynamic
visual effects and managing performance in computer graphics applications.
Header Files:
The in-built functions are defined in the OpenGL library. Some of the headers that are
used are as follows:
• #include <stdio.h>: to take input from standard input and write to standard output.
Inbuilt Functions (a minimal usage sketch follows this list):
• glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT): clears the colour and depth buffers to
their specified clear values before the display is redrawn.
• void glutDisplayFunc(void (*func)(void)): registers the display callback for the current window.
• glFlush(): forces any buffered OpenGL commands to execute.
• void glBegin(GLenum mode): initiates a new primitive of type mode and starts the collection of
vertices. Values of mode include GL_POINTS, GL_LINES, and GL_POLYGON.
• void glEnd(): marks the end of the vertex list begun by glBegin().
• void glutInit(): initializes GLUT. The arguments from main are passed in and can be used by the
application.
• void glutCreateWindow(): creates a window on the display. The string title can be used to label
the window.
• void glutInitDisplayMode(): specifies the display mode (colour model and buffering) for the window.
• void glutInitWindowSize(): specifies the initial width and height of the window in pixels.
• void glutMainLoop(): causes the program to enter an event-processing loop. It should be the last
statement in main.
• void glutKeyboardFunc(): sets the keyboard callback for the current window.
• timedelta(void): returns the number of seconds that have elapsed since the previous call to the
function.
• void text(): draws a string of text with an 18-point Helvetica bitmap font.
• int fequal( );
• void psTimestep( );
• void psNewparticle( );
• void psBounce( ): called when a particle has gone past (or exactly hit) the ground plane; it
calculates the time at which the particle actually crossed the plane and handles the bounce.
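To show how these GLUT calls fit together, here is a minimal, hypothetical main() skeleton; the callback names display, keyboard, and init are assumptions:

int main(int argc, char** argv) {
    glutInit(&argc, argv);                                      // initialize GLUT
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB | GLUT_DEPTH);   // colour model and buffering
    glutInitWindowSize(640, 480);                               // initial window size in pixels
    glutCreateWindow("Simple Particle System");                 // create and label the window
    init();                                                     // application-specific setup
    glutDisplayFunc(display);                                   // register the display callback
    glutKeyboardFunc(keyboard);                                 // register the keyboard callback
    glutMainLoop();                                             // enter the event-processing loop
    return 0;
}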
5.2 IMPLEMENTATION
#include <windows.h>
#include <GL/glut.h>
#include <bits/stdc++.h>
// testlib.h can be downloaded from
// https://raw.githubusercontent.com/MikeMirzayanov/testlib/master/testlib.h
#include "testlib.h"
using namespace std;
bool collide;
bool game_over;
long long score;
class Point {
public:
double x, y, z;
Point() { x = y = z = 0; }
Point(double _x, double _y) { x = _x, y = _y, z = 0; }
Point(double _x, double _y, double _z) { x = _x, y = _y, z = _z; }
};
class Color {
public:
double r, g, b;
Color() { update(); }
Color(double _r, double _g, double _b) { r = _r, g = _g, b = _b; }
void update() { r = rnd.next(1.0), g = rnd.next(1.0), b = rnd.next(1.0); }
};
// renovate() repositions a car at the top of the road with a fresh speed and colours.
// It is a member of the Car class; x, y, speed, color, tolor, Lanes and Road_top are
// declared in the part of the class elided from this excerpt.
void renovate() {
    x = rnd.any(Lanes), y = Road_top;
    speed = rnd.next(0.1, .2);
    color.update(), tolor.update();
}
// The drawing code below belongs to the Car's update() routine, whose opening lines are elided.
glBegin(GL_LINES);
glVertex2d(x + 0, y + 2);
glVertex2d(x + 1, y + 3);
glVertex2d(x + 0, y + 5);
glVertex2d(x + 1, y + 4);
glVertex2d(x + 2, y + 4);
glVertex2d(x + 3, y + 5);
glVertex2d(x + 2, y + 3);
glVertex2d(x + 3, y + 2);
glEnd();
glBegin(GL_QUADS);
glVertex2d(x + 0.25, y + 5.5);
glVertex2d(x + 2.75, y + 5.5);
glVertex2d(x + 3, y + 5);
glVertex2d(x + 0, y + 5);
glVertex2d(x + 0, y + 2);
glVertex2d(x + 3, y + 2);
glVertex2d(x + 2.6, y);
glVertex2d(x + 0.4, y);
glEnd();
glPointSize(10);
glBegin(GL_POINTS);
glVertex2d(2.75 + x, y + 1);
glVertex2d(0.25 + x, y + 1);
glEnd();
glPopMatrix();
return;
}
glRectd(x + 0, y + .5, x + 3, y + 3.5);
glBegin(GL_LINES);
glVertex2d(x + 0, y + .5);
glVertex2d(x + 1, y + 1.5);
glVertex2d(x + 0, y + 3.5);
glVertex2d(x + 1, y + 2.5);
glVertex2d(x + 2, y + 2.5);
glVertex2d(x + 3, y + 3.5);
glVertex2d(x + 2, y + 1.5);
glVertex2d(x + 3, y + 0.5);   // assumed closing endpoint (missing from the excerpt)
glEnd();                      // the GL_LINES block must be closed before starting the quads
glBegin(GL_QUADS);
glVertex2d(x + 0.25, y);
glVertex2d(x + 2.75, y);
glVertex2d(x + 3, y + 0.5);
glVertex2d(x + 0, y + 0.5);
glVertex2d(x + 0, y + 3.5);
glVertex2d(x + 3, y + 3.5);
glVertex2d(x + 2.6, y + 5.5);
glVertex2d(x + 0.4, y + 5.5);
glEnd();
glPointSize(10);
glBegin(GL_POINTS);
glVertex2d(2.75 + x, y + 4.5);
glVertex2d(0.25 + x, y + 4.5);
glEnd();
glPopMatrix();
}
} PlayerCar(1, 0, 0.1);   // closes the (partially elided) Car class and declares the player's car
vector<Car>Cars(5);
class Road {
public:
vector<double> Dash;
Road() { for (int i = 0; i <= Road_top + 4; i += 4) Dash.push_back(i); }
void update() {
glColor3d(.25, .25, .25);
glRectd(Road_left, Road_bottom, Road_right, Road_top);
glColor3d(1, 1, 0.25);
glRectd(Road_left + 0.5, Road_bottom, Road_left + 1, Road_top);
glRectd(Road_left + 8, Road_bottom, Road_left + 8.5, Road_top);
glRectd(Road_left + 16, Road_bottom, Road_left + 16.5, Road_top);
glRectd(Road_left + 24, Road_bottom, Road_left + 24.5, Road_top);
glColor3d(1, 1, 1);
for (auto& d : Dash) {
d -= PlayerCar.speed; if (d < 0)d = Road_top + 4;
glRectd(Road_left + 4, d, Road_left + 4.5, d - 2);
glRectd(Road_left + 12, d, Road_left + 12.5, d - 2);
glRectd(Road_left + 20, d, Road_left + 20.5, d - 2);
}
}
}Roads;
class Tree {
public:
double x, y;
Tree() { renovate(); }
void renovate() {
x = (rnd.next(2) ? rnd.next(Ortho_left, Road_left) : rnd.next(Road_right, Ortho_right));
y = rnd.next(Ortho_top, Ortho_top * 2);
}
void update() {
y -= PlayerCar.speed;
if (y <= -6)renovate();
glColor3d(.13, 1, .13);
glBegin(GL_TRIANGLES);
glVertex2d(0 + x, 5 + y);
glVertex2d(-1 + x, 4 + y);
glVertex2d(1 + x, 4 + y);
glVertex2d(0 + x, 4.4 + y);
glVertex2d(-2 + x, 3 + y);
glVertex2d(2 + x, 3 + y);
glVertex2d(0 + x, 3.8 + y);
glVertex2d(-2.5 + x, 2 + y);
glVertex2d(2.5 + x, 2 + y);
glEnd();
}
}; vector<Tree>Trees(30);
void display() {
glClear(GL_COLOR_BUFFER_BIT);
Roads.update();
PlayerCar.update();
for (auto& car : Cars) { if (!collide)car.go_down(); car.update(); }
sort(Trees.begin(), Trees.end(), [](auto& a, auto& b) {return a.y > b.y;});
for (auto& tree : Trees)tree.update();
if (game_over) {
glColor3d(.7, .7, .7); glRectd(-7.3, 17, 7.5, 22);
glColor3d(0, 0, 1), drawstring(-4, 21, 0, "Game Over");
glColor3d(0, 0, 1), drawstring(-3, 19.5, 0, "Score: " + toString(score));
glColor3d(1, 0, 0), drawstring(-7, 18, 0, "Press R to Restart");
}
else {
glColor3d(.7, .7, .7); glRectd(Ortho_left, Ortho_top - 1, Ortho_left + 10, Ortho_top - 3);
glColor3d(0, 0, 1), drawstring(Ortho_left + 2, Ortho_top - 2, 0, "Score: " + toString(score));
}
glFlush();   // flush buffered commands so the frame appears (assumes single buffering)
}            // end of display()
bool collision() {
for (auto c : Cars)if (rectRect(c.x, c.y, PlayerCar.x, PlayerCar.y))return true;
return false;
}
void timer(int v) {
if (collision()) {
game_over = 1;
collide = 1;
glutPostRedisplay();
glutSpecialFunc(NULL);
return;
}
glutPostRedisplay();
glutTimerFunc(10, timer, v);
}
// Reshape Window (excerpt from the reshape callback): keep the viewport matched to the window size.
glViewport(0, 0, width, height);
glLoadIdentity();

return 0;    // end of main() in the full program
}
CHAPTER 7