Help build syntax highlighting for MediaWiki

Jeremy Rudd

Aug 29, 2009, 2:24:10 AM8/29/09
to scite-interest
I would like to help create syntax highlighting for MediaWiki in SciTE
or preferably Scintilla itself, since I'm trying to help build support
for MediaWiki in Notepad++.

I'm quick at learning new programming/scripting languages and I would
really appreciate it if someone who is proficient in the ".properties"
language (used to describe syntax highlighting, is it not?) would
help out.

The basic syntax that must be supported is covered here:
http://en.wikipedia.org/wiki/Wikipedia:Cheatsheet
Also see my file with examples of all the basic syntax, which can be
used for testing:
http://www.mediafire.com/file/twimkzi45nt/basic mediawiki test.txt
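
For quick reference, these are the kinds of constructs I mean (a few
examples taken from the cheatsheet):

    == A level 2 heading ==
    '''bold''', ''italic'', and '''''both'''''
    * a bulleted item
    # a numbered item
    [[Internal link]] and [http://example.com an external link]
    <nowiki>markup is ignored in here</nowiki>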

Other advanced syntax, such as "template" syntax, can be developed
later.

Past discussions relating to Notepad++

https://sourceforge.net/apps/ideatorrent/notepad-plus/ideatorrent/idea/65/

https://sourceforge.net/forum/forum.php?thread_id=3373522&forum_id=331753

Philippe Lhoste

Aug 29, 2009, 6:43:19 AM8/29/09
to scite-i...@googlegroups.com
On 29/08/2009 08:24, Jeremy Rudd wrote:
> I would like to help create syntax highlighting for MediaWiki in SciTE
> or preferably Scintilla itself, since I'm trying to help build support
> for MediaWiki in Notepad++.

Note: Neil just added a lexer for Markdown (announced in the Scintilla
ML, available in CVS), another "natural" markup language; it might be
interesting to look at it.

> I'm quick at learning new programming/scripting languages and I would
> really appreciate it if someone who is proficient in the ".properties"
> language, (used to describe syntax highlighting, is it not?) would
> help out.

No. Properties files are there to define the list of keywords and the
styles; that's (mostly) all.
Lexers are written in C++, in the Scintilla component (SciLexer, to be
exact).
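
For illustration, the properties side of a hypothetical "mediawiki"
lexer would look something like this (the names and colours are made
up; no such lexer exists yet):

    # Hypothetical: associate a file pattern with a "mediawiki" lexer
    file.patterns.mediawiki=*.wiki;*.mediawiki
    lexer.$(file.patterns.mediawiki)=mediawiki
    # Keyword list 0 handed to the lexer (if the lexer uses keywords at all)
    keywords.$(file.patterns.mediawiki)=nowiki noinclude includeonly
    # One style definition per style number the lexer assigns
    style.mediawiki.0=fore:#000000
    style.mediawiki.1=fore:#00007F,bold
    style.mediawiki.2=fore:#7F0000,italics

The decision about which characters get which style number is made in
the C++ lexer, not here.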

--
Philippe Lhoste
-- (near) Paris -- France
-- http://Phi.Lho.free.fr
-- -- -- -- -- -- -- -- -- -- -- -- -- --

Randy Kramer

Aug 29, 2009, 8:05:15 AM8/29/09
to scite-i...@googlegroups.com
I'm a newbie to Scite/scintilla, C++, and writing lexers, but I thought
I'd chip in here, partly to see if I can explain what I know so far.
I'm trying to write a lexer for TWiki markup (well, that would cover
Foswiki as well, and some extensions I'm using on a project of my own).

Philippe seems to be one of the gurus of syntax highlighting, as he
answers quite a number of the questions raised on the subject.

He is (of course) right: the properties file is mostly not the place to
create syntax highlighting--things like the styles used in the syntax,
and lists of keywords (if there are any in MediaWiki--I don't think
there are any in TWiki, although I might reconsider that), are defined
in the property files.

What you need to do to "create" syntax highlighting is create a lexer.
A lexer can be written in C++, Lua, or as an external lexer in almost
any language, iiuc.

I'll call the C++ lexer a native scintilla lexer--btw, all lexers are
part of scintilla (or interface with scintilla, as opposed to
interfacing to SciTE), so a project that uses scintilla (like
Notepad++) will get syntax highlighting as soon as scintilla does, plus
or minus things like updating the scintilla version used in Notepad++.

Lua is scintilla/SciTE's scripting language, and a lexer can be written
in it.

I don't know much about an external lexer.

I'm attempting to write a C++ lexer because I think it will be faster.

The approach I took so far was to pick another lexer (it happened to be
the YAML lexer), duplicate its C++ file (LexYAML.cxx) as the first
version of my TWiki lexer (i.e., LexTWiki.cxx), and get it to work with
SciTE/scintilla to lex YAML files. (Changes have to be made in other
files to get this to work--there is a document that is part of the
SciTE/scintilla distribution (or, at least, part of CVS) that lists the
steps (12, iirc) needed to do that.) I found a few minor errors in that
document, two that I've overcome and one that I haven't so far--I plan
to share those with the list, but haven't done so yet, just based on
uncertainty--if you tried it and saw the same problems I did, I'd feel
more confident in presenting those issues.
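
To give you an idea of the shape of the thing, here is a sketch
modelled on the existing lexers like LexYAML.cxx--the names and style
numbers are placeholders I made up, and I haven't compiled exactly
this:

    // LexTWiki.cxx (skeleton only)
    #include <stdlib.h>
    #include <string.h>
    #include <ctype.h>

    #include "Platform.h"
    #include "PropSet.h"
    #include "Accessor.h"
    #include "StyleContext.h"
    #include "KeyWords.h"
    #include "Scintilla.h"
    #include "SciLexer.h"

    // Placeholder style numbers; the real ones would be SCE_* values
    // added to SciLexer.h as part of the "steps" document above.
    static const int TWIKI_DEFAULT = 0;
    static const int TWIKI_HEADING = 1;

    // Assign a style number to every character in [startPos, startPos+length).
    static void ColouriseTWikiDoc(unsigned int startPos, int length,
                                  int initStyle, WordList *[],
                                  Accessor &styler) {
        StyleContext sc(startPos, length, initStyle, styler);
        for (; sc.More(); sc.Forward()) {
            if (sc.atLineStart) {
                // "---+", "---++", ... at the start of a line is a heading;
                // the state then holds for the rest of the line.
                if (sc.Match("---+"))
                    sc.SetState(TWIKI_HEADING);
                else
                    sc.SetState(TWIKI_DEFAULT);
            }
        }
        sc.Complete();
    }

    // "twiki" is the name the properties file refers to; a folding
    // function can be passed as a fourth argument once it exists.
    LexerModule lmTWiki(SCLEX_AUTOMATIC, ColouriseTWikiDoc, "twiki");

That, plus the other files the "steps" document lists (KeyWords.cxx,
the makefiles, and so on), is essentially all the plumbing; the real
work is the logic inside the loop.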

Oh, at some point I should mention that a lexer (at least an internal
C++ lexer) typically performs two functions: it lexes a document for
syntax highlighting purposes, and it lexes (I guess that's still the
right word) the document for folding purposes. I'd encourage you to
also work towards folding for MediaWiki--folding is such a great
thing--imho, everybody should learn to use folding. ;-)

There is another document in the distribution (or CVS) by the author of
Leo (Edward K. Ream) that provides some insight into what the lexer
must do in a little more detail.

One problem I see with writing a lexer in C++ is that, so far (afaics),
none have been written with a regex approach. I suspect it would be
possible to do so, and wonder why it hasn't been done so far. Possibly there is
too much overhead--the approach, in general, that a scintilla lexer
takes is to lex from the point that an edit occurs (i.e., for each
character typed) all the way to the end of the document. For short
documents that is not a problem.

I edit some rather large documents (at least, imho)--several are
currently at about 10 MB. For kicks, I tried lexing them with
the "converted" YAML lexer--near the end of the document there was no
problem, but when I edited near the beginning of the document I got
the "typing in molasses effect". That makes me worry about using
regexes.

Without using regexes, though, you sort of have to create C++ logic that
does essentially the same thing. I'm thinking some of that stuff
through atm, in my head and with pencil and paper.

I get the feeling that a Lua lexer might be simpler, and might even let
you use regexes (are they a built-in feature of Lua?), but I fear it
would be slower.

My long files consist of multiple records. Syntax highlighting
doesn't extend past the boundary of a record, so I'm fairly sure that
I'll be able to limit my need for lexing to the end of the current
record, not all the way to the end of the file. I could be wrong,
however, and Neil Hodgson, for one, seems to be skeptical. I do plan to
provide some way to force a lexing all the way to the end of the file
when necessary if the user notices that something doesn't look right.

<gratuitous digression on kate>

Aside: ime, kate is notorious for this (unless it was the way I wrote
the file that I had to write to make kate highlight and fold TWiki
markup)--the fold markers are constantly getting out of sync with the
actual content of the document. Sometimes I can force them back into
sync by saving the document, sometimes by doing things like deleting a
section and then pasting it back in exactly the same place, but
sometimes I have to actually reload the document, which is a royal pain
because lots of the document's state (like the folding state) is lost.

While I'm digressing about kate, kate has a known bug that makes it
necessary for me to add ending markup to make the document fold
properly--that is why I'm looking for a different editor. I'm almost
100% sure scintilla doesn't have the same problem, based partially on
the description of how folding works in scintilla--I can't recall (atm)
if I found a way to actually test that in scintilla without having a
working TWiki lexer. To expound on that a little bit, scintilla has
what I think is the right approach to folding--they assign a fold level
to almost every line (I forget if they assign a fold level to blank
lines or not, but they handle blank lines regardless). With a fold
level assigned to each line, scintilla can always find the end of a
folding region by a change in the fold level. I don't know (or
remember?) the details of how kate does it, but when I wanted to have
multiple fold regions end at the same point, I had to insert ending
markup for kate to work properly.

</gratuitous digression on kate>

I'm also pretty sure that I can write the lexer so that in many cases it
need lex only to the end of a line (much like the way most of the time
I think I only have to lex to the end of a record). Whether this is
worth the extra logic and complication that might be required will be
something I'll try to decide later.

Anyway, I hope that helps a little, and if you decide to try to write a
lexer for MediaWiki in C++, I'll offer you what help I can. Note
however, that this will be just about my first foray into programming
in C++ (or C, for that matter).

Randy Kramer

PS: Assuming you are interested, I can refer you to exactly the
documents that I referred to above--in fact, let me see if I can find
them...

"Add a lexer to Scintilla and SciTE": if you do a cvs checkout of SciTE,
this document is in scite/doc/SciTELexer.html--it is also on the
website, at http://www.scintilla.org/SciTELexer.html.

"How to write a scintilla lexer" is at
http://www.scintilla.org/Lexer.txt (it, and most of the other documents
I mention are also in cvs).

I'm pretty sure you'll need or want to read more than just these
documents--there are documents on an external lexer, a Lua lexer, on
the general design of scintilla, and on (sort of) the scintilla
internals (I guess that's what you'd call it).

Jeremy Rudd

Aug 29, 2009, 8:27:54 AM8/29/09
to scite-interest
On Aug 29, 3:43 pm, Philippe Lhoste <Phi...@GMX.net> wrote:

> Note: Neil just added a lexer for Markdown (announced in the Scintilla
> ML, available in CVS), another "natural" markup language, it might be
> interesting to look at it.

> Lexers are written in C++, in the Scintilla component (SciLexer exactly).


Hi Philippe,
Is there anyone who can "add" a lexer for MediaWiki? I'm sure there
are many experienced Scintilla open source contributors who might find
it quite easy to write the required code.

Unless someone can start me off on working with the SciLexer source
code and show me the additions required per lexer (I've never
contributed to open source software, but I can always try when I get
spare time).

Best regards, Jeremy

Randy Kramer

Aug 29, 2009, 8:28:36 AM8/29/09
to scite-i...@googlegroups.com
I really should (proof) read before I send--just fixed a few "grammos"
and "spellos".

Randy Kramer

Jeremy Rudd

Aug 29, 2009, 8:36:41 AM8/29/09
to scite-interest
Hi Randy Kramer!

Thanks for the amazing explanation and the great links to start me
off.

Unfortunately I've never worked with source code like this, so I'm
assuming I'll simply have to download the C++ SciTE and Scintilla
source code (??) and work on it offline using Visual Studio 2008, which
I have installed. Would it compile straight off on my WinXP machine or
would I have to play around a lot?

If it's written in Lua, of which I know nothing, I would really need
your help in getting started!

Best regards, Jeremy.

Randy Kramer

Aug 29, 2009, 10:34:47 AM8/29/09
to scite-i...@googlegroups.com
On Saturday 29 August 2009 08:36:41 am Jeremy Rudd wrote:
> Hi Randy Kramer!
>
> Thanks for the amazing explanation and the great links to start me
> off.

You're welcome, but it sounds like I may have misled you somewhat (not
intentionally). I'm very much a newbie to all this, and what I told
you is what I think I've learned.

More below...

> Unfortunately I've never worked with source code like this, so I'm
> assuming I'll simply have to download the C++ SciTE and Scintilla
> source code (??) and work on it offline using Visual Studio 2008
> which I have installed. Would it compile straight off on my WinXP
> machine or would I have to play around a lot?

I'm working in Linux, I'm fairly sure others are working in Windows
(even Neil, iiuc), but you need someone who's using Windows to answer
that kind of question.

> If its written in Lua, of which I know nothing, I would really need
> your help in getting started!

Can't be much help there, other than maybe to point you to some Lua
documentation--since my plan is to use C++, I don't have much interest
in Lua. (I do have a side interest--I do plan to write some macros,
and I've experimented with one--but my focus is not on writing a lexer
in Lua.)

Also, don't get your hopes up as to a timeline--I'd have to think about
how long I've been thinking about or gradually starting to work on
this; it wouldn't surprise me if it's been 4 months already, and I
don't expect to finish in the next several months (partly because at
times I have to take my attention away from this).

I tried to give you an idea of what's involved, and I'm willing to try
to give you some help, but it's also sort of like the blind leading the
blind.

Sorry if I've built your hopes too high!

Randy Kramer


John Yeung

Aug 29, 2009, 11:17:22 AM8/29/09
to scite-i...@googlegroups.com
On Sat, Aug 29, 2009 at 8:36 AM, Jeremy Rudd<jrudd.d...@gmail.com> wrote:
> Unfortunately I've never worked with source code like this,
> so I'm assuming I'll simply have to download the C++ SciTE
> and Scintilla source code (??)

Yes.

> and work on it offline using Visual Studio 2008 which
> I have installed. Would it compile straight off on my
> WinXP machine or would I have to play around a lot?

One way to find out is simply to try it. ;) I expect it would work
right out of the box. I am a Windows user with virtually no C++
knowledge and I was able to compile Scintilla and SciTE using MinGW.
I don't recall needing any more information than what was in the
readme files.
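
If memory serves, it boils down to something like the following, run
from the unpacked source tree (treat the readme files in
scintilla/win32 and scite/win32 as the authority, not me):

    rem Visual C++ (from a Visual Studio command prompt)
    cd scintilla\win32
    nmake -f scintilla.mak
    cd ..\..\scite\win32
    nmake -f scite.mak

    rem MinGW
    cd scintilla\win32
    mingw32-make
    cd ..\..\scite\win32
    mingw32-make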

John

Neil Hodgson

Aug 29, 2009, 9:04:09 PM8/29/09
to scite-i...@googlegroups.com
Randy Kramer:

> Lua is scintilla/SciTE's scripting language, and a lexer can be written
> in it.

There is no support in standard SciTE for writing a lexer in Lua.
This doesn't actually make it impossible, since all a lexer really does
is assign a style number to each byte of text in the file, and since
Lua has access to setting style bytes it could do this. However, SciTE
does not call any Lua function when it receives a style-needed event,
so from Lua you would have to try to use other events.

Mitchell Foral produces a variant SciTE-St that provides PEG (Parsing
Expression Grammar) support to allow implementation of lexers in Lua.
http://caladbolg.net/scite_st.php
http://en.wikipedia.org/wiki/Parsing_expression_grammar

The reason that SciTE does not support writing lexers in Lua is not
that I don't like the idea. It's that I think it should not just be a
low-level event like OnStyleNeeded(upToPosition) but that there should
be a nice high-level Lua helper object provided to the function, and no
one has yet had time or desire to work on this.

> One problem I see with writing a lexer in C++ is that, so far, (afaics),
> none have been written with a regex approach. I suspect it would be
> possible to do so, and wonder why it hasn't so far.

Regular expressions have limited functionality and you will
eventually run into some code that you cannot lex correctly. Many
editors have tried this route, sometimes with good results on common
code. There are also often unhandled cases, and it becomes more
difficult for more complex languages.

> While I'm digressing about kate, kate has a known bug that makes it
> necessary for me to add ending markup to make the document fold
> properly--that is why I'm looking for a different editor. I'm almost
> 100% sure scintilla doesn't have the same problem, based partially on
> the description of how folding works in scintilla--I can't recall (atm)
> if I found a way to actually test that in scintilla without having a
> working TWiki lexer.

There have also been issues with final line folding in Scintilla
since it is a special case.

Neil

Philippe Lhoste

Aug 30, 2009, 4:19:38 AM8/30/09
to scite-i...@googlegroups.com
On 30/08/2009 03:04, Neil Hodgson wrote:
> The reason that SciTE does not support writing lexers in Lua is not
> that I don't like the idea. It's that I think it should not just be a
> low-level event like OnStyleNeeded(upToPosition) but that there should
> be a nice high-level Lua helper object provided to the function and no
> one has yet had time or desire to work on this.

What are the specs/needs for such a helper object? Something like
StyleContext?

Neil Hodgson

Aug 30, 2009, 5:18:09 AM8/30/09
to scite-i...@googlegroups.com
Philippe Lhoste:

> What are the specs/needs for such helper object? Something like
> StyleContext?

Yes.

Neil

Randy Kramer

Aug 30, 2009, 12:25:35 PM8/30/09
to scite-i...@googlegroups.com
Neil,

Thanks for the corrections / clarifications...

BTW, sometimes I go off on long tangents--no need for any response on
the kate stuff (or even to read it), I'm quite confident there won't be
a similar problem with Scintilla.

On Saturday 29 August 2009 09:04:09 pm Neil Hodgson wrote:
> Randy Kramer:


> Mitchell Foral produces a variant SciTE-St that provides PEG
> (Parsing Expression Grammar) support to allow implementation of
> lexers in Lua. http://caladbolg.net/scite_st.php
> http://en.wikipedia.org/wiki/Parsing_expression_grammar

Yup, I guess I got confused by having found this document (some time
ago):

http://caladbolg.net/luadoc/scite-st/modules/lexer.html

and then not remembering (recognizing?) that it was for SciTE-St.

And maybe also this document:

http://lua-users.org/wiki/SciteTextFolding

Which says:

"Outline Mode for Text Documents

"This is a simple extension which allows structured text documents to be
viewed in outline, by making folding available. It is an example of a
basic line-driven lexical styler. It currently requires SciteExtMan; to
install put Files:wiki_insecure/editors/SciTE/fold.lua in your
scite_lua directory."

I guess the key here is that this refers to a line-driven lexer, and
thus is mostly (if not wholly) suitable for folding (for which fold
levels are determined on a per-line basis). (BTW, the wording is clear
in retrospect, just a little unclear perhaps for a newbie who doesn't
recognize the importance of the "line-driven" before the "lexical
styler".) ;-)

Based on that, clearly, if you only wanted to fold, but not highlight
syntax, lua could be an appropriate choice.

Without knowing anything about SciTE-St, I think I'd prefer to stick
with just plain SciTE, on my assumption that its Scintilla (if there is
any difference between SciTE-St's Scintilla and the "base" Scintilla)
is the thing that is integrated into most editors that use Scintilla.
I wouldn't want to limit my choice in other editors using Scintilla,
although I presume that I could do the necessary things to make some
other editor use a non-standard Scintilla.

(I mention this because I recently found Geany which has some of the
features I was going to suggest for SciTE--I'm quite sure there are
other good editors out there using Scintilla, and I might prefer one of
those, or use a few different ones for slightly different purposes.)

> > One problem I see with writing a lexer in C++ is that, so far,
> > (afaics), none have been written with a regex approach. I suspect
> > it would be possible to do so, and wonder why it hasn't so far.
>
> Regular expressions have limited functionality and you will
> eventually run into some code that you can not lex correctly. Many
> editors have tried this route with sometimes good results on common
> code. There are also often unhandled cases and it becomes more
> difficult for more complex languages.

Yes, that could be the root of some of the anomalies I've found in the
nedit and kate syntax highlighting files that I've written.

> > While I'm digressing about kate, kate has a known bug that makes it
> > necessary for me to add ending markup to make the document fold
> > properly--that is why I'm looking for a different editor. I'm
> > almost 100% sure scintilla doesn't have the same problem, based
> > partially on the description of how folding works in scintilla--I
> > can't recall (atm) if I found a way to actually test that in
> > scintilla without having a working TWiki lexer.
>
> There have also been issues with final line folding in Scintilla
> since it is a special case.

Thanks for that--I'll watch for that problem, but I should have been
clearer in my description of the problem I have in kate. The problem
is not with the final line in the file, but with each piece of
fold-beginning markup requiring corresponding ending markup.

(Many people don't see the problem in kate because, in a lot of cases
(most programming languages?), there is naturally an ending marker for
each beginning marker--for example, much of the folding in C/C++ is
based on the { } pair: a fold starts at the { and ends at the }, and
even if the pairs are nested, there is eventually a } for each {.
Similarly, begin / end blocks and if / [else /] endif blocks
all "naturally" have ending markup. I'm pretty sure some programming
languages must have the problem--maybe nobody (or nobody with enough
influence) has tried to write a syntax highlighting / folding file for
kate for such a language. BTW, I'm now remembering (if my memory is
not tricking me) that those files are written in XML.)

I'll explain the problem a little more, as best as I can remember it,
again, I'm almost certain Scintilla won't have the same problem.

I'll start with an example--suppose I have a document structured like
the following where each heading is a fold point. Aside: the "---+..."
is TWiki markup for headings, and in a "real" TWiki document, there
would probably be text and so forth below each heading, I'm not showing
any for the sake of clarity:

---+ a level 1 heading
---++ a level 2 heading
---+++ a level 3 heading
---++++ a level 4 heading
---+ the last level 1 heading

Aside: afaict, kate does a scan in real time for folding markup; it
doesn't keep a "vector" like Scintilla does of lines and their folding
level. I don't see that as a crucial difference in the problem that
I'm describing for kate, but it might help explain why I describe
things the way I do.

Another aside: kate is very much line oriented, and has a hard time
searching for anything that crosses a line boundary, or maybe a better
way to describe it: if a search does cross a line boundary, it can't
backtrack--if I recall the words used by some of the kate people I
corresponded with, the "token gets eaten" and can't be rescanned.

I should also caution that I haven't gone back to refresh myself on the
details--this "explanation" is a little fuzzy on some of the details.
(BTW, I do have (somewhere) my notes on this experience, if someone is
really interested ... ;-)

So, back to the attempt at an explanation of kate's problem:

Note that the fold "regions" for the first headings at levels 1, 2, 3,
and 4, should all end at the (beginning) of the "last level 1 heading".

So, kate's folding logic scans forward to look for the next heading at
the same folding level as the current region (e.g., if kate is scanning
through the level 4 region looking for the end, it will end whenever it
finds the start of a new region at level 1, 2, 3, or 4).

The problem is (and here's where my memory is a little fuzzy), when kate
crosses the line boundary from the end of the level 4 region, to the
beginning of the level 1 region, its scanning routine "eats"
the "$---+ " for the new level 1 region. (Hmm, maybe I have that
backwards--maybe the scan for the end of the first level 1 region finds
the "$---+ " and eats it. Doesn't really matter, the point is ...) Now
that kate's scanner has moved into that new line, it can't backtrack to
find that same "$---+ " to end the other fold regions. (I'm definitely
fuzzy on the details.)

Anyway, I did write to a kate list, and got at least one response, none
suggesting any workaround. (I mean, obviously, at some level of coding
there would be a solution or workaround, but I was looking for a
solution without reprogramming kate's C++, in other words, staying
within the capabilities of the XML file used for defining syntax
highlighting and folding.)

I was going to file a bug, but in going to do so, I found another bug
that essentially described the same root problem, but causing a
different (non-folding related) symptom, so I added a comment to that
bug. IIU/RC, a workaround was developed that would cure that other
symptom, but not help with my folding problem. :-(

I'm quite certain (but maybe not 100%) that Scintilla's approach to
folding will preclude a similar problem.
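
To make that concrete, here is roughly what I have in mind for the
folding half of the TWiki lexer (just a sketch, modelled on the
existing fold functions--untested, and the names are mine): every line
gets a fold level, heading lines get the header flag, and the depth
follows the number of '+' characters.

    static void FoldTWikiDoc(unsigned int startPos, int length, int,
                             WordList *[], Accessor &styler) {
        unsigned int endPos = startPos + length;
        int line = styler.GetLine(startPos);
        // Carry the level in from the line before the restyled range so
        // incremental lexing picks up where it left off.
        int prevRaw = (line > 0) ? styler.LevelAt(line - 1) : SC_FOLDLEVELBASE;
        int levelNext = prevRaw & SC_FOLDLEVELNUMBERMASK;
        if (prevRaw & SC_FOLDLEVELHEADERFLAG)
            levelNext++;
        unsigned int i = startPos;
        while (i < endPos) {
            unsigned int lineStart = i;
            while (i < endPos && styler.SafeGetCharAt(i) != '\n')
                i++;
            i++;        // step over the '\n'
            int level = levelNext;
            int flags = 0;
            if (styler.Match(lineStart, "---+")) {
                // "---+" is a level 1 heading, "---++" level 2, and so on.
                int depth = 1;
                while (styler.SafeGetCharAt(lineStart + 3 + depth) == '+')
                    depth++;
                level = SC_FOLDLEVELBASE + depth - 1;
                flags = SC_FOLDLEVELHEADERFLAG;
                levelNext = level + 1;   // body lines nest under the heading
            }
            styler.SetLevel(line, level | flags);
            line++;
        }
    }

In the example above, the four nested headings would get levels base+0
through base+3 (all flagged as headers), and "the last level 1 heading"
drops back to base+0, which is what ends all four regions at that
line--no ending markup needed.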

And, hopefully, I'll find out for sure, soon! ;-)

And, sorry for the noise. ;-)

Randy Kramer

Randy Kramer

Aug 30, 2009, 1:47:43 PM8/30/09
to scite-i...@googlegroups.com
Hmm, I probably confused anybody who tried to read this--somehow my
brain (or lack thereof) made me write $ instead of ^ to indicate the
beginning of a line (in a regex pattern).

And, the main bug on the kate list dealing with the issue is:

http://bugs.kde.org/show_bug.cgi?id=112888

Randy Kramer
