My basic "ScreenTile" class has all the info needed for the renderer
to draw the tile graphic, including its position on the grid. My data
model consists of several objects, of which many will come and
go. Players, monsters, items, projectiles and the map. None of these
have any knowledge of the Display (or ViewClass) and I don't want them
to. It's easy putting a flag in each object and setting it when that
object changes, ViewClass will check it and trigger a redraw if
needed. But my problem is how to handle the deltas. At present the
only way in my head is to have a "previous state" in each data object
so that the View can look at what to replace.
e.g.
Creature class;
position = x1,y1
previousposition = x2,y2
When a creature moves, set its previous position. ViewClass then looks
that up and pulls the tile from the Map, draws the old tile back in, and
draws the creature. My concern about this approach is that there's a
lot of different types of data. The Map itself has FOV flags so for
that I would need to ensure the Map class keeps a record of tiles that
move "out of view" so that the ViewClass knows to draw them black (or
faded if they've been seen already). My other concern is the number of
actions that the Creature class contains. I need to make sure that the
"redraw" flag is set appropriately in all cases. I feel a spaghetti
nightmare coming on and it all seems very messy to me.
A simple strategy is to have the screen broken into regions (say a 4x4
grid), and, when something moves, mark the origin and destination
regions as "dirty". Then your main draw cycle picks the dirty regions
and redraws them, and flags them clean. Depending on the structure of
your drawing logic, you might have to also flag the moving entities
dirty, and redraw them as appropriate. If I'm reading your post
correctly, this is indeed the case. From what you describe, I assume
(hope!) all your actions use some common animation code -- put the
redraw flag logic there.
1) Erase objects
2) Run game logic
3) Draw objects
:)
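The region scheme and the erase/logic/draw loop above might be sketched like this (a minimal Python sketch; the DirtyRegions name, the callback shape, and the 4x4 split are assumptions of mine, not anything from the thread):

```python
# Sketch of region-based dirty flagging: the screen is split into a
# coarse grid of regions; moves dirty the origin and destination
# regions, and the draw cycle redraws only dirty regions.
REGIONS_X, REGIONS_Y = 4, 4  # assumed 4x4 split

class DirtyRegions:
    def __init__(self):
        self.dirty = set()

    def mark(self, rx, ry):
        self.dirty.add((rx, ry))

    def mark_move(self, old_region, new_region):
        # When something moves, flag both origin and destination regions.
        self.mark(*old_region)
        self.mark(*new_region)

    def flush(self, redraw):
        # Main draw cycle: redraw each dirty region, then flag it clean.
        for rx, ry in self.dirty:
            redraw(rx, ry)
        self.dirty.clear()
```

A moving entity would call `mark_move` with its old and new region coordinates; everything else only touches `mark` when it changes.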
Jotaf
I roll for disbelief that your problem is unnecessary redraws.
I would think netbooks should be able to support full screen
repaints. There is something wrong with how you are blitting your
layers.
Are you using SDL? GL? GTK? Software comping in python?
> My basic "ScreenTile" class has all the info needed for the renderer
> to draw the tile graphic, including its position on the grid. My data
> model consists of several objects, of which many will come and
> go. Players, monsters, items, projectiles and the map. None of these
> have any knowledge of the Display (or ViewClass) and I don't want them
> to.
Break into two layers: Sprites and Terrain. Sprites can be reblitted
at 60FPS, the terrain you can re-comp on terrain changes and otherwise
just blit as a bitmap. Dirty rectangles are cool, but when you start
overlaying UI elements for highlighting, adding ambient animations to
creatures, etc, you are likely going to be redrawing everything
anyways.
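That sprite/terrain split could look something like this (a rough sketch; the class name and callback signatures are hypothetical, not SFML or anyone's actual API):

```python
# Two-layer rendering: the terrain is composited once into a cached
# bitmap and only re-comped when terrain actually changes; sprites are
# blitted over it every frame.
class LayeredRenderer:
    def __init__(self):
        self.terrain_dirty = True   # force an initial composite
        self.terrain_cache = None

    def frame(self, compose_terrain, blit, draw_sprite, sprites):
        if self.terrain_dirty:
            self.terrain_cache = compose_terrain()  # expensive, rare
            self.terrain_dirty = False
        blit(self.terrain_cache)    # one cheap full-bitmap blit
        for s in sprites:           # sprites reblitted every frame
            draw_sprite(s)
```

Anything that digs a wall or melts a floor sets `terrain_dirty`; creatures, items, and projectiles never touch it.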
> It's easy putting a flag in each object and setting it when that
> object changes, ViewClass will check it and trigger a redraw if
> needed. But my problem is how to handle the deltas. At present the
> only way in my head is to have a "previous state" in each data object
> so that the View can look at what to replace.
You are violating your "Data model knows nothing about the view". The
*view* should handle any such caching! Not the data.
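One way to keep that cache entirely on the view side, as a sketch (the View name and the glyph-grid model are my own assumptions, not the poster's classes): the view remembers the last glyph it drew in each cell and diffs against the model's current grid each frame.

```python
# The view caches what it last drew; the data model stays oblivious.
class View:
    def __init__(self, width, height):
        self.last_drawn = [[None] * width for _ in range(height)]

    def render(self, model_glyphs, draw_tile):
        # Diff the model's current grid against the cached copy and
        # redraw only cells that differ.  Returns the redraw count.
        changed = 0
        for y, row in enumerate(model_glyphs):
            for x, glyph in enumerate(row):
                if self.last_drawn[y][x] != glyph:
                    draw_tile(x, y, glyph)
                    self.last_drawn[y][x] = glyph
                    changed += 1
        return changed
```

No previous-position fields, no redraw flags scattered through Creature: the model just exposes its current state and the view works out the delta.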
> When creature moves set previous position. ViewClass then looks that
> up and pulls the tile from the Map. Draws the old tile back in and
> draws the creature.
Except... You are double buffering, right? So you need to draw the
old tile back from two frames ago, not the current frame! So, yes,
maintenance nightmare.
--
Jeff Lait
(POWDER: http://www.zincland.com/powder)
> I roll for disbelief that your problem is unnecessary redraws.
> I would think netbooks should be able to support full screen
> repaints. There is something wrong with how you are blitting your
> layers.
You'd be right. Actually my entire pipeline is suffering from massive
amounts of array copying. It's all being rewritten.
> Are you using SDL? GL? GTK? Software comping in python?
SFML. The rendering slowdown was down to me regenerating sf::String
objects every frame. This, compounded by the redundant copying further
back up the pipe, caused 1fps on my netbook. Since I'm optimising I
might as well do it right.
> Break into two layers: Sprites and Terrain. Sprites can be reblitted
> at 60FPS, the terrain you can re-comp on terrain changes and otherwise
> just blit as a bitmap. Dirty rectangles are cool, but when you start
> overlaying UI elements for highlighting, adding ambient animations to
> creatures, etc, you are likely going to be redrawing everything
> anyways.
Hmm, this seems a good alternative. Terrain changes will only occur
when players move or on impacts. Everything else can be overlaid on
top. I'll look into this one.
> Except... You are double buffering, right? So you need to draw the
> old tile back from two frames ago, not the current frame! So, yes,
> maintenance nightmare.
SFML handles the buffering internally. I throw data at it as if it's
single buffered.
Good to hear! I didn't think those netbooks were that slow :>
> > Are you using SDL? GL? GTK? Software comping in python?
>
> SFML. The rendering slowdown was down to me regenerating sf::String
> objects every frame. This, compounded by the redundant copying further
> back up the pipe, caused 1fps on my netbook. Since I'm optimising I
> might as well do it right.
You might also consider making a braindead project where you directly
call the blit functions to draw all the tiles every frame with none of
your rendering engine overhead. This will give you an idea of what the
speed of light is for your target frame rate and let you assess how much
overhead your engine is imposing.
> > Break into two layers: Sprites and Terrain. Sprites can be reblitted
> > at 60FPS, the terrain you can re-comp on terrain changes and otherwise
> > just blit as a bitmap. Dirty rectangles are cool, but when you start
> > overlaying UI elements for highlighting, adding ambient animations to
> > creatures, etc, you are likely going to be redrawing everything
> > anyways.
>
> Hmm, this seems a good alternative. Terrain changes will only occur
> when players move or on impacts. Everything else can be overlaid on
> top. I'll look into this one.
The big problem with this is that you throw away the ability to have
animated terrains.
Note you don't need to rebuild the terrain on player moves. You can
instead build a flat bitmap terrain slightly larger than the viewport,
centered on the player. Then, on a move, you only need to update the
strip of tiles that will enter/leave the viewport. To render, use 4
blits instead of one to deal with the fact that your terrain is a 2D
ring buffer.
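The "4 blits" can be worked out as pure index math before any drawing happens; a sketch (function name and rectangle format are my own, and the blit tuples are (src_x, src_y, dst_x, dst_y, w, h)):

```python
def ring_blits(ox, oy, vw, vh, bw, bh):
    """Split a viewport read from a 2D ring buffer into up to 4 blits.

    (ox, oy) is the viewport origin inside the wrapped terrain bitmap
    of size (bw, bh); (vw, vh) is the viewport size.
    """
    ox %= bw
    oy %= bh
    # Horizontal spans: the part up to the right edge, plus any wrap.
    x_spans = [(ox, 0, min(vw, bw - ox))]
    if bw - ox < vw:
        x_spans.append((0, bw - ox, vw - (bw - ox)))
    # Vertical spans, same idea.
    y_spans = [(oy, 0, min(vh, bh - oy))]
    if bh - oy < vh:
        y_spans.append((0, bh - oy, vh - (bh - oy)))
    # Cross the spans: 1, 2 or 4 rectangles covering the viewport.
    return [(sx, sy, dx, dy, w, h)
            for sx, dx, w in x_spans
            for sy, dy, h in y_spans]
```

When the origin doesn't wrap you get a single blit; wrapping on both axes gives the full four.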
> > Except... You are double buffering, right? So you need to draw the
> > old tile back from two frames ago, not the current frame! So, yes,
> > maintenance nightmare.
>
> SFML handles the buffering internally. I throw data at it as if it's
> single buffered.
So you are already doing a full screen copy per frame right there.
Overdraw isn't your issue :>
Is this a big problem? What roguelike games use animated
terrains (or sprites) anyway?
I'm drawing only the areas that have changed and it's fast
enough even with SDL's software surface. For tiles it's
easy to use an array the size of the view and flag the tiles that
were changed during the turn. Then draw only those changed
tiles. It really makes a difference in speed.
Now when I think of it, it could be nice to have an option
to use a hardware surface..
You can easily software render the whole screen on the PCs of several
years ago. If you're writing in C++ or the like, you can software
render it explicitly in your own code.
Really, there's no need to fret about dirty rectangles and the like in
99% of cases.
- Gerry Quinn
Probably not. I'm sure SFML draw()'s objects to the current offscreen
buffer, then when I call display() it switches buffers. I doubt there's
any copy going on there. Anyways, after some testing I've switched from
using sf::String as they've proven to be far too slow. I'm now using
SFML sprites which are opengl accelerated surfaces. On the netbook
even using these to draw my tiles (3560 of them) it's still too slow,
but I've more optimisation to go. This does mean, though, that I've got
to draw all my own graphics tiles, which tempts me away from pure
ASCII and makes this game even less of a roguelike than it already
isn't. (It's a squad based TBS).
Maybe the software rendering really is so slow that you can
get more speed by updating only those areas that change.
This is the case from my experience with SDL. I don't care
about theory, I always try things and see what happens.
Even software blitting* with SDL should be very fast as long as you are
not using images with alphas or large resolutions; I get well over 100fps
with my low end laptop when rendering typical 80x30 pseudo terminal.
*) Does modern graphics hardware actually have that much support for
the kind of accelerated 2d that SDL offers? I'd guess that to get the
most out of modern hw one should use OpenGL or DX.
--
tomppa
My game Lair of the Demon Ape uses my own software sprite engine - I
didn't use alphas in it but even if I had I'd have had no problem
getting a decent FPS (I've used them in full-screen animated games).
I've been told it runs a bit slow on WINE, which may or may not be down
to the graphics - but certainly it is fine on quite old Windows PCs.
I don't use dirty rectangles, just a full redraw to a buffer every
frame, followed by a blit to the playing area.
These days if you have trouble with graphics speed on a PC target
machine, you either have very high ambitions or are doing something
wrong, IMO.
- Gerry Quinn
Some people seem to think that everyone owns an updated or new PC,
which isn't true.
Besides in more complex games there are other things going on
than just drawing graphics. It's easy to say that drawing is fast
enough if your game is a simple 7DRL.
I was speaking of PCs that are (say) six or seven years old.
> Besides in more complex games there are other things going on
> than just drawing graphics. It's easy to say that drawing is fast
> enough if your game is a simple 7DRL.
Most of these other things, if correctly coded, are generally much
faster than graphics. But even if that applies to a particular game,
it only means that your optimisation time is better spent on something
other than graphics.
[Sluggish languages can have an effect, of course. One of the
advantages of using C++ or a similar language is that you don't have to
jump through hoops to optimise your code, which often compensates for
some of the supposed disadvantages in ease of coding.]
- Gerry Quinn
Well.. Quake which is 3D, full screen, real time, with music and
sounds runs perfectly on a Pentium 90 which came out in 1994..
I doubt roguelikes will become more complex than Quake or that we will
ever need to support hardware slower than P90.
T.
Those can't be compared, so please don't try to.
It's like some people wondering why 3D rendering is so slow
compared to 3D games. They don't know what the fuck they are
talking about.
FWIW SDL was already fast enough for basic 2d stuff (like roguelikes)
over 10 years ago. Actually I think my old Matrox Millennium II equipped
PII/300mhz had considerably better non-3d-accelerated performance than
most machines I have owned after that.
There are many things that one can do in a roguelike to eat up processor
time but calculating and drawing the player viewport shouldn't really
have been a problem in roguelikes for years.
--
tomppa
This reminds me when I tried a recent version of Crawl with
tiles. It was so slow that I couldn't play it. It happens.
It's time to stop using your 486.
--
Derek
Game info and change log: http://sporkhack.com
Beta Server: telnet://sporkhack.com
IRC: irc.freenode.net, #sporkhack
..my Sempron 3000+ 1.80Ghz. Not yet. This is a nice computer
to make sure MY games are not slow.
..that's so loaded down with viruses it apparently runs like a 486.
Seriously, dude, dithering over performance to that degree is the
hallmark of "missing the forest for the trees".
I'm just saying that even roguelikes with console ascii output
can get slow (I've seen that in Nethack). When people talk
about the speed of modern PCs they in fact don't know anything
about programming in reality. They never tried things and
they keep repeating mantras for god knows what reason.
I've got an idea. Why don't you define what you mean by "slow", and
then you can start making unsupported assertions about it?
I remember when Angband first went to Lua - the game went from
twitch-playable (just how I like it) to laggy. Especially the run commands.
But this has nothing do to with the graphics rendering. Bad structures
and middleware that expects faster CPUs are more likely culprits.
There's not much point optimising the graphics pipeline of a roguelike
unless you're absolutely positive that's where your bottleneck is.
--
mike blackney
mi...@mikeblackney.com
Grillen Verboten!
Precisely. Or embedding an interpreted language, say.
You'll lose far more performance having an AI that takes 50ms to
calculate optimal strategy. Which doesn't sound so bad until you apply
it to a big room where 20 monsters are attacking the player.
I'm currently working on a python roguelike using pyopengl. To get
around the re-rendering problem, I did a combination of two things:
1) Break the map up into chunks
2) load each chunk into OpenGL buffers using glGenBuffers() and draw
straight from GPU memory
As a result, every frame you issue draw commands, but you only have as
many draw commands as there are chunks and you aren't moving data from
the CPU to the GPU so it takes a relatively trivial amount of time.
In your game loop you iterate through the dirty chunks and update the
buffer on the GPU.
This resulted in a massive speedup. You'll have to learn some opengl
but it's well worth it.
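The dirty-chunk bookkeeping behind that scheme can be sketched independently of OpenGL (the upload callback stands in for a glBufferSubData-style re-upload; the names and the 16-tile chunk size are assumptions of mine):

```python
CHUNK = 16  # tiles per chunk side (assumed)

class ChunkedMap:
    """Tile edits mark their chunk dirty; the frame loop re-uploads
    only dirty chunks to the GPU, so the per-frame draw commands just
    reference buffers that are already resident."""
    def __init__(self, w, h):
        self.tiles = [[0] * w for _ in range(h)]
        self.dirty_chunks = set()

    def set_tile(self, x, y, t):
        self.tiles[y][x] = t
        self.dirty_chunks.add((x // CHUNK, y // CHUNK))

    def upload_dirty(self, upload):
        # upload(chunk) would refill that chunk's GPU buffer.
        for chunk in self.dirty_chunks:
            upload(chunk)
        self.dirty_chunks.clear()
```

Many tile edits in the same chunk collapse into a single re-upload, which is where the speedup comes from.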
If you are interested in the source code, let me know.
Digitalghost
On Jan 25, 7:25 am, skreeg <skr...@gmail.com> wrote:
> Hi folks,
> I've moved Traction Edge (my steampunk Xcomish game previously called
> EnemyX) development over to a Netbook and I've found my now n00bish
> year old rendering code is just rubbish and is very slow, so the whole
> thing is up for a rewrite. Following the MVC design pattern I've
> decided to implement some kind of delta rendering so only changes
> between frames are drawn. My problem is how to "detect" the changes.
>
> My basic "ScreenTile" class has all the info needed for the renderer
> to draw the tile graphic, including its position on the grid. My data
> model consists of several objects, of which many will come and
> go. Players, monsters, items, projectiles and the map. None of these
> have any knowledge of the Display (or ViewClass) and I don't want them
> to. It's easy putting a flag in each object and setting it when that
> object changes, ViewClass will check it and trigger a redraw if
> needed. But my problem is how to handle the deltas. At present the
> only way in my head is to have a "previous state" in each data object
> so that the View can look at what to replace.
>
> e.g.
>
> Creature class;
> position = x1,y1
> previousposition = x2,y2
>
> When creature moves set previous position. ViewClass then looks that
> up and pulls the tile from the Map. Draws the old tile back in and
Why in the world can't those be compared? If Quake could repaint the
screen in software every frame in 1994, shouldn't a roguelike be able
to now?
> On 2010-01-28, Krice <pau...@mbnet.fi> wrote:
>> On 28 tammi, 19:25, Tomi Neste <tomi.ne...@gmail.com> wrote:
>>> There are many things that one can do in a roguelike to eat
>>> up processor time but calculating and drawing the player
>>> viewport shouldn't really have been a problem in roguelikes
>>> for years.
>>
>> This reminds me when I tried a recent version of Crawl with
>> tiles. It was so slow that I couldn't play it. It happens.
>
> It's time to stop using your 486.
I had the same experience. Maybe it's something to do with lacking 2d
acceleration, but Crawl with tiles does not work at all well on one of
my machines, which is aging but not that much.
IMO, if you do graphics, you should do it right. Just putting small
pictures on the screen doesn't cut it anymore. Even in the '80s
this wasn't state of the art.
If you have graphics that resemble "real" objects and monsters, having
them animated is expected. I was actually disappointed when seeing
that the water, lava and clouds in Crawl *aren't* animated.
Nowadays a graphical RL should at least look like Rogue Touch:
http://www.chronosoft.com/ (Attention: youtube link of an iPhone app
inside)
That's the main reason an ASCII mode (or a graphical mode that
resembles an ASCII mode) is a big plus. You don't expect fancy
graphics if you see it.
Bye
Patric
--
NetHack-De: NetHack auf Deutsch - http://nethack-de.sf.net/
NetHack for AROS: http://sf.net/projects/nethack-aros/
UnNetHack: http://apps.sf.net/trac/unnethack/
Sure, it happens. But it was a bug that caused that slowdown. Try
the newest version again.
> Krice <pau...@mbnet.fi> wrote:
>> On 26 tammi, 16:31, Jeff Lait <torespondisfut...@hotmail.com> wrote:
>>> The big problem with this is that you throw away the ability to have
>>> animated terrains.
>>
>> Is this a big problem? What roguelike games use animated
>> terrains (or sprites) anyway?
>
> IMO, if you do graphics, you should do it right. Just putting small
> pictures on the screen doesn't cut it anymore. Even in the '80s
> this wasn't state of the art.
>
> If you have graphics that resemble "real" objects and monsters, having
> them animated is expected. I was actually disappointed when seeing
> that the water, lava and clouds in Crawl *aren't* animated.
Maybe it's because I play a lot of board games, but I don't have any
such expectation.
They are nowadays. :) Well, water and lava are, as well as mutagenic
clouds. On topic, this "animation" only became possible when we
switched away from caching and to OpenGL, which incidentally also
caused the lags Krice mentions. The upside to these lags was that when
trying to optimise the rendering we found several memory holes (also
for ASCII) and several places where rendering happened much more often
than was necessary. Players still occasionally complain about lags,
but it usually comes down to outdated drivers and updating them
appears to make the problem go away. No idea if it's really fixed,
though, as I haven't been able to reproduce this for ages. I guess
we'll find out when 0.6 comes out. :(
Johanna
On Fri, 29 Jan 2010 15:59:37 -0600, Paul Donnelly wrote:
> Patric Mueller <bh...@bigfoot.com> writes:
> > IMO, if you do graphics, you should do it right. Just putting small
> > pictures on the screen doesn't cut it anymore. Even in the '80s
> > this wasn't state of the art.
> > If you have graphics that resemble "real" objects and monsters, having
> > them animated is expected. I was actually disappointed when seeing
> > that the water, lava and clouds in Crawl *aren't* animated.
> Maybe it's because I play a lot of board games, but I don't have any
> such expectation.
Neither do I.
Ad Astra!
JuL
--
jyn...@gmx.de / Work like you don't need the money
Jürgen ,,JuL'' Lerch      /  Dance like no one was watching
/ Love like you've never been hurt
/ - ?
Dude, lots of us posting here have *made* roguelikes (and/or other
games).
- Gerry Quinn
I'm just saying that software blitting (true software blitting)
is kind of slow. It's a different story if a graphics routine
is accelerated and done by hardware. Try it and see it yourself.
In SDL use SDL_SWSURFACE flag for surfaces.
Mongrol is indeed me. I've solved my problem for the time being by
removing all usage of sf::String and moving entirely to sprites. I now
get a consistent 3-5fps drawing about 2600 sprites. This is good enough
for development and I'll no doubt get more increases as I rewrite my
pipeline. I've been doing all sorts of silly things since I started
C++ a year ago and expect good improvements. SFML2 is not an option as
OSX is my primary platform.
As usual, lots of good replies and discussion, thanks all.
I already do software blitting in animated games. I write to
DIBSections with standard C++ code. Lair of the Demon Ape uses it.
It's more than fast enough for roguelikes, or puzzle games in general -
I can get 30 fps full screen on quite old machines.
- Gerry Quinn
To be fair, on a Pentium 100, Quake slowed to a treacle-like crawl if
you tried to run it at anything other than 320x200.
--
\_\/_/ turbulence is certainty turbulence is friction between you and me
\ / every time we try to impose order we create chaos
\/ -- Killing Joke, "Mathematics of Chaos"
This is a field in which pixel-graphics roguelikes are variously either
20 or 27 years out of date.
Origin Systems Inc.'s _Ultima III_, released for the Apple II in August
1983, had animated terrain and character figures.
Origin Systems Inc.'s _Ultima VI_, released for MS-DOS IBM-compatible
PCs in 1990, had _overlaid_ animated terrain and character figures.
To be honest I prefer ascii (with or without animations) to Ultima III's
graphics.
Most roguelikes are a one-man (or woman) operation, and that person is
normally a programmer. So the choice is not ascii or good looking
tiles, it's ascii or horrible looking tiles.
And even if you have nice looking tiles it is much easier to get away
with not having animations with ascii than with tiles (look at crawl for
example - it has much better tiles than most roguelikes & the ascii
version still looks better).
-Ido.
It's probably worth adding 'IMHO' to that last sentence :)
I think the tiles version is nicer and more informative.
Anyone tried my Wizard's Quest?
While it's ascii (well, mostly. there are several unicode characters),
I tried to take advantage of the graphical backend and added some
informative animations... Kind of a compromise between tiled and
terminal displays.
I tried to but got the following error message:
-Ido.
It crashed right at the start, I didn't see anything except for the error.
> Paul Donnelly <paul-d...@sbcglobal.net> wrote:
>>Why in the world can't those be compared? If Quake could repaint the
>>screen in software every frame in 1994, shouldn't a roguelike be able
>>to now?
>
> To be fair, on a Pentium 100, Quake slowed to a treacle-like crawl if
> you tried to run it at anything other than 320x200.
Yeah, but you're probably not running a Pentium 100 now.
Since I'm forced to use sprites, my choices are actually horrible
looking tiles or horrible looking ascii, since I have to draw, render
or transform some ascii from somewhere. So I've gone with horrible
looking tiles instead.