Re: [okapitools] Read & Write to/from custom database tables

Manuel Souto Pico

Apr 14, 2022, 1:35:42 PM
to okapi-users
Dear Yves,

I have read your reply to Bob from back in 2012, in the old list. Has anything changed in the meantime? Is there a filter in Okapi Rainbow to extract text from SQL files?

If that's not the case, could you share a few tips to point me in the right direction on how to create one? I suppose I should take the plain text filter as a basis? I have an MS SQL dump I need to translate.

Thanks in advance.

Cheers, Manuel


Yves Savourel <yv...@opentag.com> wrote on Monday, 2 January 2012 at 16:09:

Hi Bob,

> Are there any tools or Java API in the okapitools
> framework which would help with exporting
> translation units from SQL database tables to
> XLIFF and importing them back in.

Not currently.

I did start a Database Filter a couple of months ago, but have not had time to get it to a level where it's usable yet. Hopefully this is something that can be done in the coming months.

> Any suggestions or pointers on how to
> approach this are greatly appreciated.

Meanwhile, an alternative would be to export to some kind of CSV or tab-delimited format and use that format with the Table Filter. Or, if the database allows it, export to some XML-based format and use the XML Filter or the XML Stream Filter. The latter would be a good solution if the content of some of your fields is in HTML or another format.
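
For example, here is a small sketch of that XML route (not part of Okapi; the column names "id" and "body_html" are made up) that writes already-selected rows to a simple XML file the XML Filter or XML Stream Filter could then process:

// A sketch only: write rows already selected from the database to a simple
// XML file for extraction with the XML Filter or XML Stream Filter.
// The column names ("id", "body_html") are hypothetical.
import java.io.FileOutputStream;
import java.sql.ResultSet;
import javax.xml.stream.XMLOutputFactory;
import javax.xml.stream.XMLStreamWriter;

public class XmlDump {

    public static void dump(ResultSet rs, String path) throws Exception {
        XMLStreamWriter xml = XMLOutputFactory.newInstance()
            .createXMLStreamWriter(new FileOutputStream(path), "UTF-8");
        xml.writeStartDocument("UTF-8", "1.0");
        xml.writeStartElement("rows");
        while (rs.next()) {
            xml.writeStartElement("row");
            xml.writeAttribute("id", rs.getString("id"));
            // Written as escaped text; the filter can be configured to treat
            // the content of <row> as embedded HTML if that is what the field holds.
            xml.writeCharacters(rs.getString("body_html"));
            xml.writeEndElement();
        }
        xml.writeEndElement();
        xml.writeEndDocument();
        xml.close();
    }
}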

Hope this helps,
-yves

yves.s...@gmail.com

Apr 15, 2022, 6:33:26 AM
to Manuel Souto Pico, okapi-users

Hi Manuel,

One way to translate a DB would be to export it to some CSV format and translate that, then import it back again.

It’s obviously not as efficient, but it’s often doable.
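
For instance, a minimal sketch of the export half of that round trip (not part of Okapi; the connection URL, table, and column names are made up) that writes an id/text CSV the Table Filter can then process:

// A sketch only: export translatable rows to a CSV file for use with the
// Table Filter; the translated CSV would later be imported back.
// Connection URL, table, and column names ("strings", "id", "text") are hypothetical.
import java.io.PrintWriter;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ExportToCsv {

    // Quote a CSV field and escape any embedded double quotes.
    static String csv(String value) {
        return "\"" + value.replace("\"", "\"\"") + "\"";
    }

    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                 "jdbc:sqlserver://localhost;databaseName=mydb", "user", "password");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT id, text FROM strings");
             PrintWriter out = new PrintWriter(Files.newBufferedWriter(
                 Paths.get("strings.csv"), StandardCharsets.UTF_8))) {

            out.println("id,text"); // header row: the Table Filter can be set to skip it
            while (rs.next()) {
                out.println(csv(rs.getString("id")) + "," + csv(rs.getString("text")));
            }
        }
    }
}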

As for creating a new filter: Yes, the main thing is to implement the IFilter interface, and likely IFilterWriter to put the translation back.

Most of the mechanism is likely to go into hasNext() and next(), which provide the TextUnits one after the other.

With a DB you would not have skeleton parts to deal with, as merging back would likely just re-inject the translation.
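
For example, a rough sketch of that core (not a complete IFilter implementation: the other interface methods, the parameters handling, and the START_DOCUMENT/END_DOCUMENT events are left out, and the table and column names are made up) that walks a JDBC result set and hands back one TextUnit per row:

// A rough sketch only: the core of a database "filter" that walks a JDBC
// ResultSet and produces one Okapi TEXT_UNIT event per row. A real filter
// would implement the full IFilter interface.
// Table and column names ("strings", "id", "text") are hypothetical.
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import net.sf.okapi.common.Event;
import net.sf.okapi.common.EventType;
import net.sf.okapi.common.resource.TextUnit;

public class DbTextUnitReader {

    private ResultSet rs;
    private boolean hasRow;

    public void open(Connection con) throws Exception {
        Statement st = con.createStatement();
        rs = st.executeQuery("SELECT id, text FROM strings");
        hasRow = rs.next(); // position on the first row
    }

    // Same role as IFilter.hasNext(): is there another text unit to return?
    public boolean hasNext() {
        return hasRow;
    }

    // Same role as IFilter.next(): wrap the current row in a TEXT_UNIT event.
    public Event next() throws Exception {
        TextUnit tu = new TextUnit(rs.getString("id"), rs.getString("text"));
        hasRow = rs.next(); // advance for the following call
        return new Event(EventType.TEXT_UNIT, tu);
    }

    public void close() throws Exception {
        if (rs != null) rs.close();
    }
}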

Now, extracting text from a DB is relatively simple, but most likely the text itself can be more than plain text: often there is HTML, JSON, or another layer of formatting. So you would probably want to implement some sub-filtering capability, or chain several filters. That part is trickier, and dealing with inline codes is often one of the most challenging parts of a filter.
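
For example, here is a sketch of the chaining idea, rather than the framework's dedicated sub-filtering support: run a field value that contains HTML through Okapi's HTML filter and keep the TEXT_UNIT events it produces, so that filter deals with the inline codes. The field value passed in is hypothetical.

// Sketch of chaining a second filter (not the formal sub-filter mechanism):
// the HTML filter extracts the text units from one field's HTML content,
// handling the inline codes along the way.
import java.util.ArrayList;
import java.util.List;
import net.sf.okapi.common.Event;
import net.sf.okapi.common.LocaleId;
import net.sf.okapi.common.resource.RawDocument;
import net.sf.okapi.filters.html.HtmlFilter;

public class HtmlFieldExtractor {

    // Returns the TEXT_UNIT events the HTML filter produces for one field value.
    public static List<Event> extract(String htmlFieldValue, LocaleId srcLocale) {
        List<Event> textUnits = new ArrayList<>();
        HtmlFilter html = new HtmlFilter();
        try {
            html.open(new RawDocument(htmlFieldValue, srcLocale));
            while (html.hasNext()) {
                Event e = html.next();
                if (e.isTextUnit()) {
                    textUnits.add(e);
                }
            }
        } finally {
            html.close();
        }
        return textUnits;
    }
}

You could call it per row, e.g. extract(rs.getString("body_html"), LocaleId.ENGLISH), and pass the resulting events downstream instead of one plain-text TextUnit.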

For the configuration/parameters, I suppose you would have some information on what to extract: column names, maybe even SQL statements, whatever gives you flexibility in deciding what to extract and merge.
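
For example, a plain sketch of the kind of settings involved (all names and defaults are made up; a real filter would likely expose these through Okapi's parameters mechanism):

// A sketch of the settings such a filter might need; in a real Okapi filter
// these would likely live in an IParameters implementation.
// All field names and the default values are hypothetical.
public class DbFilterSettings {

    public String table = "strings";       // table to read from
    public String idColumn = "id";         // column used as the text unit id
    public String textColumn = "text";     // column holding the translatable text
    public String customQuery = null;      // optional: full SQL overriding the above
    public boolean textIsHtml = false;     // whether to sub-filter the text as HTML

    // Build the SELECT used for extraction, unless a custom query is given.
    public String buildQuery() {
        if (customQuery != null) {
            return customQuery;
        }
        return "SELECT " + idColumn + ", " + textColumn + " FROM " + table;
    }
}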

There is some documentation about filters and the internal workings of Okapi in the dev docs here: https://okapiframework.org/devguide/index.html. It’s a bit outdated, but still mostly valid information.

Cheers,

-yves
