Analytics for custom JSON Web Services


Sanjeev Devireddy

Aug 1, 2019, 3:07:35 AM
to TopBraid Suite Users
Hi,
  We implemented JSON Web Services by writing subclasses of ui:JSONServices. Now we want to implement/generate analytics (API calls per day/week/month/...) for those web services. Any suggestions on how we can implement analytics for the APIs? Is there any out-of-the-box cross-cutting concern that helps us implement the analytics?
I see an option to log messages using the logger ui:log, but it writes the messages to a common log file. Is there a way to create a separate log file for the Web Services project we developed, so that we get clean logs?


Thanks,
Sanjeev

Richard Cyganiak

Aug 1, 2019, 3:22:14 PM
to topbraid-users list
Hi Sanjeev,

On 1 Aug 2019, at 08:07, Sanjeev Devireddy <deviredd...@gmail.com> wrote:

I see an option to log the messages by using the logger ui:log but it writes the messages to a common log file. Is there a way to create a separate log file for the Web Services project that we developed so that it gives the clean logs?

No, not in the current release. We have a ticket to implement separate special-purpose logs for the next major release.

In the meantime, your best bet might be to use the common log with appropriate filtering as a post-processing step, or perhaps to ping an external service via HTTP to do centralised logging, such as Logstash.
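A post-processing filter along these lines could also double as the analytics asked about above. The sketch below is purely illustrative: the log line format (an ISO-style `YYYY-MM-DD` timestamp at the start of each line) and the service name are assumptions, not the actual TopBraid log layout, so the regex would need adjusting to the real format of the common log.

```python
import re
from collections import Counter

def api_calls_per_day(lines, service_name):
    """Count log lines mentioning service_name, grouped by date.

    Assumes each relevant line begins with a YYYY-MM-DD timestamp;
    adapt the regex to the actual format of the common log file.
    """
    date_prefix = re.compile(r"^(\d{4}-\d{2}-\d{2})")
    counts = Counter()
    for line in lines:
        if service_name not in line:
            continue
        match = date_prefix.match(line)
        if match:
            counts[match.group(1)] += 1
    return counts
```

Weekly or monthly totals would then just be a matter of re-grouping the daily counts.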

Richard






Sanjeev Devireddy

Aug 8, 2019, 9:04:56 AM
to TopBraid Suite Users
Hi Richard,

   For the EDG/TBL common log file (/var/lib/topbraid/ontologies/.metadata/.log), is there a rollover configuration that determines when a rollover (backup) occurs, based on file size or date? If so, could you please share the location of that configuration? It would help us design our log-filtering post-processing step in a better way.
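If the rollover uses a numeric-suffix naming scheme (e.g. `.log.1`, `.log.2`, with higher numbers being older backups) — which is an assumption to verify against the server's actual logging configuration — a post-processing step could stitch the backups back into chronological order like this:

```python
import glob

def rotated_logs_in_order(base_path):
    """Return rotated log files oldest-first, ending with the live file.

    Assumes a numeric-suffix rollover scheme (base.log.2 is older than
    base.log.1); verify against the server's actual logging config.
    """
    def suffix_num(path):
        tail = path.rsplit(".", 1)[-1]
        return int(tail) if tail.isdigit() else -1

    backups = [p for p in glob.glob(base_path + ".*") if suffix_num(p) >= 0]
    backups.sort(key=suffix_num, reverse=True)  # highest suffix = oldest
    return backups + [base_path]
```

Feeding the concatenated files, oldest first, into the filtering step would then give continuous per-day counts across rollovers.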


Thanks,
Sanjeev