Direction on adding new language/frameworks

Brad Wood

unread,
Jan 26, 2021, 11:50:13 PM1/26/21
to framework-benchmarks
Hello all, I recall this project from back in probably 2013, when I looked at adding my favorite language, CFML, to it.  The setup back then was pretty rough and I think I lost interest after a bit.  I found this project again today and I see it's come a very long way and has run a lot more rounds!

I'd like to poke again at getting CFML added, but I have some questions about how it would best be organized.  CFML (or ColdFusion) is a language supported by at least two major vendors: Adobe ColdFusion and Lucee Server.  And while CFML is a language, it's also considered a basic web framework, as it has built-in support for routing and a lot of other features.  In addition, there are a handful of MVC frameworks built on top of CFML, such as ColdBox MVC or FW1.  What would make the most sense for organizing these combinations?  My first thought was something like:

frameworks/CFML/Lucee
frameworks/CFML/ColdFusion
frameworks/CFML/Lucee-ColdBox
frameworks/CFML/ColdFusion-ColdBox
frameworks/CFML/Lucee-FW1
frameworks/CFML/ColdFusion-FW1
etc...

Though to be honest, I'd just be happy to get cfml/lucee up and working as a starting point.  Can anyone offer direction on what organization would make the most sense?  Then I can see about figuring out how to set up CFML to run inside your testing framework.
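For reference, my rough understanding from skimming a couple of existing frameworks is that each test directory carries a benchmark_config.json (plus a dockerfile and the app source), so a bare-bones entry for a Lucee test might look something like the following.  This is just my guess at the shape of it, not a working config, and the field values would need checking against the real docs.

 {
   "framework": "cfml-lucee",
   "tests": [{
     "default": {
       "json_url": "/json",
       "plaintext_url": "/plaintext",
       "port": 8080,
       "approach": "Realistic",
       "classification": "Platform",
       "database": "None",
       "framework": "cfml-lucee",
       "language": "CFML",
       "platform": "Lucee",
       "webserver": "None",
       "os": "Linux",
       "database_os": "Linux",
       "display_name": "cfml-lucee"
     }
   }]
 }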

Thanks!

~Brad

Brad Wood

unread,
Jan 27, 2021, 1:58:32 AM1/27/21
to framework-benchmarks
I'm certainly liking the way the tests currently work.  I found some old E-mails from 2013 tonight where I was telling my boss I had spent 8 hours and still didn't have the tests working. I was able to spin up the gemini test in about 20 minutes on Docker tonight.  A marked improvement!

I've been playing around with some CFML tests, as I discussed in my previous post.  I am, however, getting a Python error at the end of all the JSON test runs.  Can someone help me understand whether this is a bug in the project or an issue with my tests?  Right after "Concurrency: 512 for json" completes, I get the following output.  It doesn't seem like the plaintext test runs at all, even though it shows as verified.

cfml-lucee: Traceback (most recent call last):
cfml-lucee:   File "/FrameworkBenchmarks/toolset/benchmark/benchmarker.py", line 209, in __run_test
cfml-lucee:     self.__benchmark(test, benchmark_log)
cfml-lucee:   File "/FrameworkBenchmarks/toolset/benchmark/benchmarker.py", line 295, in __benchmark
cfml-lucee:     benchmark_type(test_type)
cfml-lucee:   File "/FrameworkBenchmarks/toolset/benchmark/benchmarker.py", line 285, in benchmark_type
cfml-lucee:     results = self.results.parse_test(framework_test, test_type)
cfml-lucee:   File "/FrameworkBenchmarks/toolset/utils/results.py", line 149, in parse_test
cfml-lucee:     rawData["startTime"], rawData["endTime"], 1)
cfml-lucee:   File "/FrameworkBenchmarks/toolset/utils/results.py", line 471, in __parse_stats
cfml-lucee:     if len(main_header[item_num]) != 0:
cfml-lucee: IndexError: list index out of range
cfml-lucee: Error during test: cfml-lucee
cfml-lucee: Total test time: 2m 27s
tfb: Total time building so far: 1s
tfb: Total time verifying so far: 0s
tfb: Total execution time so far: 2m 30s
================================================================================
Parsing Results ...
================================================================================
fatal: not a git repository (or any of the parent directories): .git
Running "cloc --yaml --follow-links . | grep code | tail -1 | cut -d: -f 2" (cwd=/FrameworkBenchmarks/frameworks/CFML/cfml-lucee)
Counted 88 lines of code
================================================================================
Verification Summary
--------------------------------------------------------------------------------
| cfml-lucee
|       plaintext     : PASS
|       json          : PASS
================================================================================

Results are saved in /FrameworkBenchmarks/results/20210127064436

Mike Smith

unread,
Jan 27, 2021, 10:51:36 AM1/27/21
to framework-benchmarks
Glad to hear you're making progress!

These sorts of errors are why I am rewriting the toolset >_<. This *looks* like it's failing in parse_stats, which is probably because `dstat` failed somehow, but I'm not entirely sure.
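If it is dstat's output that came up empty, the failing line in your traceback is an unguarded list index; conceptually the fix would be a bounds check along these lines (illustrative only, this is not the actual toolset code):

 # Hypothetical sketch of the kind of guard __parse_stats would need.
 # If dstat writes a truncated or empty header row, main_header can be
 # shorter than expected, and indexing into it blindly raises IndexError.
 def header_column_present(main_header, item_num):
     return item_num < len(main_header) and len(main_header[item_num]) != 0

 print(header_column_present(["time", "usr"], 1))   # True
 print(header_column_present(["time", "usr"], 5))   # False instead of IndexError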

Since you got the verification summary to show passing results, I would say you are most (if not all) of the way there. Maybe try opening a WIP pull request to see if your implementation passes verification on our CI/CD stuff.

Brad Wood

unread,
Jan 27, 2021, 11:16:40 AM1/27/21
to Mike Smith, framework-benchmarks
Thanks for the reply.  I can open a WIP pull and see how it looks.  If it's passing, I'll work on adding some of the DB tests.

~Brad

Developer Advocate
Ortus Solutions, Corp 

ColdBox Platform: http://www.coldbox.org 




Brad Wood

unread,
Jan 27, 2021, 7:52:09 PM1/27/21
to Mike Smith, framework-benchmarks

~Brad

Developer Advocate
Ortus Solutions, Corp 

ColdBox Platform: http://www.coldbox.org 

Brad Wood

unread,
Jan 27, 2021, 7:57:24 PM1/27/21
to Mike Smith, framework-benchmarks
Looks like the tests pass.  I did notice that the readme template, as well as the pull request readme, both mention adding your test to the build yml, but it appears that is no longer necessary.  I assume those instructions need to be updated.

Thanks!

~Brad

Developer Advocate
Ortus Solutions, Corp 

ColdBox Platform: http://www.coldbox.org 


Brad Wood

unread,
Jan 28, 2021, 1:06:50 PM1/28/21
to Mike Smith, framework-benchmarks
I have a question about debug mode.  I'm using Docker Desktop, and I'd like to be able to use a web-based monitoring tool that runs in the JVM while the tests are running, so I can tune performance.  If I launch debug mode and map the ports through, I can hit my endpoint just fine in a local browser.

 docker run ... --mode debug --test cfml-lucee

However, then the performance tests are not running!  But when I run the benchmarks like so

docker run ... --test cfml-lucee

then I can't access the ports externally to hit my monitoring tool!

Is there a way to run the tests AND also be able to debug the server inside the docker container at the same time?
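(The only crude workaround I can think of -- untested, and it assumes the app listens on 8080 inside the container -- is to shell into the running container mid-benchmark and poke at it from the inside, which doesn't really help with a browser-based monitoring tool:)

 docker ps                          # find the running cfml-lucee container
 docker exec -it <container-id> sh  # get a shell inside it while the tests run
 curl localhost:8080/json           # hit the app from inside the container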

Thanks!

~Brad

Developer Advocate
Ortus Solutions, Corp 

ColdBox Platform: http://www.coldbox.org 


Brad Wood

unread,
Jan 28, 2021, 11:02:03 PM1/28/21
to Mike Smith, framework-benchmarks
And... more questions.  I've been all through the wiki and can't find this information.  How do I tell what version of MySQL is being used?  I see a line in the console when I run the test like so:

 Step 17/27 : RUN echo "Installing mysql-server version: $(apt-cache policy mysql-server | grep -oP "(?<=Candidate: )(.*)$")"

But nowhere does that expanded text appear in any of the test run log files.

And a follow-up: is there a way to change the version of MySQL?  My JDBC driver is having issues connecting to whatever version is in use, and I'd like to be able to use an older MySQL version that's compatible.

And a final question for now: where is the list of possible databases documented?  I can't find any mention of this in the wiki.  I just had to peruse the existing benchmark_config.json files to see what other frameworks were using.
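(For the version question, one thing I may try -- assuming the database container has the mysql client available, which I haven't verified -- is asking the running container directly while a test is up:)

 docker ps | grep mysql                          # find the database container
 docker exec -it <mysql-container-id> mysql --version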

Thanks!

~Brad

Developer Advocate
Ortus Solutions, Corp 

ColdBox Platform: http://www.coldbox.org 


Brad Wood

unread,
Jan 29, 2021, 1:14:54 AM1/29/21
to Mike Smith, framework-benchmarks
Ok, I gave up on MySQL for now and just used PostgreSQL.  I updated my pull request to have all the basic tests implemented for cfml-lucee.  I couldn't find another language that implemented the "caching" test, so I ignored it for now.
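(For anyone else hunting for the database options: the only pointer I have is that the "database" value in benchmark_config.json seems to line up with the directory names under toolset/databases in the repo -- e.g. something like the following for my case, though I'm not certain of the exact spelling or capitalization the toolset expects:)

 "database": "postgres",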


Other than the issue I mentioned in my previous post about not being able to tune the performance due to port access in the Docker container, this all seems to be working.  Please review the pull request and let me know what the next steps are.

Thanks!

~Brad

Developer Advocate
Ortus Solutions, Corp 

ColdBox Platform: http://www.coldbox.org 


Mike Smith

unread,
Jan 29, 2021, 11:27:15 AM1/29/21
to framework-benchmarks
Right now (though we are transitioning to a new toolset, so this will change) the MySQL configuration is done through Docker and lives at https://github.com/TechEmpower/FrameworkBenchmarks/tree/master/toolset/databases/mysql

This version should be considered locked; the new toolset will be where (and how) updates to the databases are made.

Connecting to your running application container while benchmarking should be doable - whatever port you have specified in your `benchmark_config.json` is the port that is exposed through Docker. If, for instance, you specified 8080, you could run `curl localhost:8080/json` to request your json test.

-Mike

Brad Wood

unread,
Jan 29, 2021, 12:50:41 PM1/29/21
to Mike Smith, framework-benchmarks
> If, for instance, you specified 8080, you could run `curl localhost:8080/json` to request your json test.

Right, and to be clear, this works perfectly when I use "--mode debug" and does not work at all when I run the test normally. There appears to be a difference in how the benchmark suite exposes the ports.  And I need them exposed WHILE the tests are running.  Having them exposed in debug mode is not useful unless there is a way to also trigger the tests to run in debug mode.

Any feedback on the pull? 

Thanks!

~Brad

Developer Advocate
Ortus Solutions, Corp 

ColdBox Platform: http://www.coldbox.org 


Mike Smith

unread,
Jan 29, 2021, 1:08:12 PM1/29/21
to framework-benchmarks
The current toolset shouldn't make a difference between benchmark and debug modes here - basically, the container is started with the port exposed in debug mode, and with host networking in benchmark mode, so you should be able to connect to it either way.
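In plain docker terms (not the toolset's exact invocation, which I'd have to double-check), the difference between the two modes is roughly:

 # debug mode: the container's port is published to the host
 docker run -p 8080:8080 <your-image>

 # benchmark mode: the container shares the host's network stack
 docker run --network host <your-image>

Either way the app should be reachable on localhost from wherever Docker is actually running, though I'm not sure how --network host behaves under Docker Desktop on Windows, where the "host" is the Docker VM rather than your desktop.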

Feedback for the pull request will be in the pull request.

-Mike

Brad Wood

unread,
Jan 29, 2021, 1:38:04 PM1/29/21
to Mike Smith, framework-benchmarks
Right, I understand how it's supposed to be working.  What I'm saying is the proof is in the pudding and it doesn't work that way :)

When I start the tests in debug mode with this command (I'm using Docker desktop):

$ docker run -it --rm --network tfb -v /var/run/docker.sock:/var/run/docker.sock -v /D_DRIVE/docs/GitHub/FrameworkBenchmarks:/FrameworkBenchmarks techempower/tfb --mode debug --test cfml-lucee

This is what I get in the browser:

[image.png — browser screenshot]

And when I run the tests normally with this command:

$ docker run -it --rm --network tfb -v /var/run/docker.sock:/var/run/docker.sock -v /D_DRIVE/docs/GitHub/FrameworkBenchmarks:/FrameworkBenchmarks techempower/tfb --test cfml-lucee  

I get this in the browser (while the cfml-lucee container is up and the tests are running):

[image.png — browser screenshot]

So something is clearly different between the debug mode and the regular mode.

Thanks!

~Brad

Developer Advocate
Ortus Solutions, Corp 

ColdBox Platform: http://www.coldbox.org 

