Speed

One of Leo's "grand challenges" is to re-imagine Leo with an outline containing millions of nodes. Whether this challenge even makes sense is an open question. But one thing is clear: a speedup of 2-10 times would be inconsequential. Therefore, Rust cannot possibly be part of the solution.
My challenge is to try to understand how one might profitably use very large outlines. I have no clear picture in mind :-)
Very large collections are best thought of as graphs, IMO, because there are usually many kinds of connections between entries, depending of course on the type and intended use of the entries. However, treelike *views* into the data are very often much better for a human to work with. With large collections, it can take a long time to create a view from scratch, so it is helpful to create the most important ones in advance. In the database world, the creation of such views is aided by indexes, temporary tables, and database views. In Python (and other languages with native map structures), dictionaries can play that role.

With increasing size, finding something becomes harder. It may well be that once Leo can work with very large numbers of nodes, we will need new and faster ways to find items and peruse them.

Another issue of size is the amount of data that a single node can hold. I recently crashed Leo by trying to read some 80 megabytes of text into the body of a node. I was curious how fast it could do a search and replace on that much data, but I didn't find out because of the crash. Of course, we are currently limited by Qt's capabilities, and Leo may never need to do such a thing, so it may not matter.
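To make the index idea concrete, here is a minimal sketch of a dictionary playing the role of a precomputed database index over a large flat collection of nodes. The Node class and its fields are hypothetical illustrations, not Leo's actual API; the point is only that the dictionary is built once, in advance, so later lookups avoid a full scan.

```python
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class Node:
    """Hypothetical stand-in for an outline node, not Leo's real API."""
    headline: str
    tags: set = field(default_factory=set)

# A large, flat collection of nodes (tiny here, for illustration).
nodes = [
    Node("Parser notes", {"compiler"}),
    Node("Rust experiment", {"compiler", "speed"}),
    Node("Meeting log", {"admin"}),
]

# Build the index once, in advance, like a database index or temp table.
by_tag: defaultdict = defaultdict(list)
for n in nodes:
    for t in n.tags:
        by_tag[t].append(n)

# Each lookup is now a single dictionary access per tag, instead of
# scanning every node in the collection.
print([n.headline for n in by_tag["compiler"]])
```

For millions of nodes, the same pattern extends naturally: one dictionary per kind of connection, each serving as a ready-made treelike view into the underlying graph.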